Jun 13 04:50:40 crc systemd[1]: Starting Kubernetes Kubelet... Jun 13 04:50:40 crc restorecon[4633]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jun 13 04:50:40 
crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 
04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc 
restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 
13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 
crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 
crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:40 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 
04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jun 13 04:50:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 
04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 
04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc 
restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jun 13 04:50:41 crc restorecon[4633]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jun 13 04:50:42 crc kubenswrapper[4894]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.008281 4894 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015588 4894 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015640 4894 feature_gate.go:330] unrecognized feature gate: SignatureStores Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015678 4894 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015689 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015700 4894 feature_gate.go:330] unrecognized feature gate: OVNObservability Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015713 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015723 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015732 4894 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015741 4894 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015750 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015760 4894 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015772 4894 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
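The deprecation warnings above say that flags such as --container-runtime-endpoint, --volume-plugin-dir, --register-with-taints and --system-reserved should instead be expressed in the KubeletConfiguration file passed via --config. The sketch below is not taken from this node's configuration; it is a minimal illustration, using the k8s.io/kubelet/config/v1beta1 Go types, of how those flags map onto config-file fields. The socket path, taint, and reservation values are placeholders, not values read from this log, and the exact file used on a CRC node may differ.

```go
// Minimal sketch: render a KubeletConfiguration covering the fields that the
// deprecated flags in the log refer to. Values are placeholders, not taken
// from this node.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletconfigv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletconfigv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// replaces --container-runtime-endpoint (placeholder socket path)
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
		// replaces --volume-plugin-dir (placeholder directory)
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// replaces --register-with-taints (placeholder taint)
		RegisterWithTaints: []corev1.Taint{
			{Key: "node-role.kubernetes.io/master", Effect: corev1.TaintEffectNoSchedule},
		},
		// replaces --system-reserved (placeholder reservations)
		SystemReserved: map[string]string{"cpu": "500m", "memory": "1Gi"},
		// eviction thresholds, the replacement suggested for
		// --minimum-container-ttl-duration, also live in this file
		EvictionHard: map[string]string{"memory.available": "100Mi"},
	}

	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	// Prints the YAML that would be referenced by the kubelet's --config flag.
	fmt.Println(string(out))
}
```

On OpenShift the kubelet config file is rendered by the machine-config machinery rather than written by hand, so a sketch like this only shows which fields correspond to the deprecated flags in the messages above.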
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015784 4894 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015794 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015803 4894 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015813 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015823 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015832 4894 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015841 4894 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015850 4894 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015858 4894 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015868 4894 feature_gate.go:330] unrecognized feature gate: NewOLM Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015877 4894 feature_gate.go:330] unrecognized feature gate: PinnedImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015886 4894 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015895 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015907 4894 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015919 4894 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015929 4894 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015939 4894 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015948 4894 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015958 4894 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015968 4894 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015977 4894 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015986 4894 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.015995 4894 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016007 4894 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016018 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016028 4894 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016038 4894 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016047 4894 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016056 4894 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016065 4894 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016075 4894 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016085 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016094 4894 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016103 4894 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016112 4894 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016121 4894 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016130 4894 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016139 4894 feature_gate.go:330] unrecognized feature gate: Example Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016149 4894 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016158 4894 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jun 13 04:50:42 crc 
kubenswrapper[4894]: W0613 04:50:42.016171 4894 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016183 4894 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016193 4894 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016202 4894 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016211 4894 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016223 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016232 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016241 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016250 4894 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016259 4894 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016270 4894 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016279 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016288 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016297 4894 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016306 4894 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016317 4894 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
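The long runs of "unrecognized feature gate" warnings come from an OpenShift feature-gate list being fed to a kubelet that only knows the upstream Kubernetes gates, and the same set is re-logged on every parsing pass, so the noise repeats several times during startup (and continues just below). A hedged sketch, against the same illustrative saved copy of the journal, that deduplicates the warnings so only the unique gate names need reviewing:

# Hypothetical helper: collapse repeated "unrecognized feature gate" warnings.
import re
from collections import Counter

GATE = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(path="kubelet-journal.log"):
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            counts.update(GATE.findall(line))
    return counts

if __name__ == "__main__":
    for gate, n in sorted(unrecognized_gates().items()):
        print(f"{gate}: seen {n} time(s)")
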
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016330 4894 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016338 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.016347 4894 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017271 4894 flags.go:64] FLAG: --address="0.0.0.0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017304 4894 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017321 4894 flags.go:64] FLAG: --anonymous-auth="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017333 4894 flags.go:64] FLAG: --application-metrics-count-limit="100" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017347 4894 flags.go:64] FLAG: --authentication-token-webhook="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017358 4894 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017370 4894 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017381 4894 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017391 4894 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017400 4894 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017411 4894 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017420 4894 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017429 4894 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017439 4894 flags.go:64] FLAG: --cgroup-root="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017448 4894 flags.go:64] FLAG: --cgroups-per-qos="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017457 4894 flags.go:64] FLAG: --client-ca-file="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017466 4894 flags.go:64] FLAG: --cloud-config="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017474 4894 flags.go:64] FLAG: --cloud-provider="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017484 4894 flags.go:64] FLAG: --cluster-dns="[]" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017499 4894 flags.go:64] FLAG: --cluster-domain="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017507 4894 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017517 4894 flags.go:64] FLAG: --config-dir="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017525 4894 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017535 4894 flags.go:64] FLAG: --container-log-max-files="5" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017546 4894 flags.go:64] FLAG: --container-log-max-size="10Mi" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017555 4894 flags.go:64] FLAG: 
--container-runtime-endpoint="/var/run/crio/crio.sock" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017564 4894 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017574 4894 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017583 4894 flags.go:64] FLAG: --contention-profiling="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017592 4894 flags.go:64] FLAG: --cpu-cfs-quota="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017601 4894 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017610 4894 flags.go:64] FLAG: --cpu-manager-policy="none" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017620 4894 flags.go:64] FLAG: --cpu-manager-policy-options="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017631 4894 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017640 4894 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017649 4894 flags.go:64] FLAG: --enable-debugging-handlers="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017688 4894 flags.go:64] FLAG: --enable-load-reader="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017700 4894 flags.go:64] FLAG: --enable-server="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017710 4894 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017724 4894 flags.go:64] FLAG: --event-burst="100" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017736 4894 flags.go:64] FLAG: --event-qps="50" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017747 4894 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017757 4894 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017769 4894 flags.go:64] FLAG: --eviction-hard="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017783 4894 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017793 4894 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017824 4894 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017838 4894 flags.go:64] FLAG: --eviction-soft="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017850 4894 flags.go:64] FLAG: --eviction-soft-grace-period="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017861 4894 flags.go:64] FLAG: --exit-on-lock-contention="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017902 4894 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017912 4894 flags.go:64] FLAG: --experimental-mounter-path="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017921 4894 flags.go:64] FLAG: --fail-cgroupv1="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017929 4894 flags.go:64] FLAG: --fail-swap-on="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017939 4894 flags.go:64] FLAG: --feature-gates="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017950 4894 
flags.go:64] FLAG: --file-check-frequency="20s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017959 4894 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017968 4894 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017977 4894 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017986 4894 flags.go:64] FLAG: --healthz-port="10248" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.017996 4894 flags.go:64] FLAG: --help="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018005 4894 flags.go:64] FLAG: --hostname-override="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018013 4894 flags.go:64] FLAG: --housekeeping-interval="10s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018022 4894 flags.go:64] FLAG: --http-check-frequency="20s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018031 4894 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018040 4894 flags.go:64] FLAG: --image-credential-provider-config="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018049 4894 flags.go:64] FLAG: --image-gc-high-threshold="85" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018058 4894 flags.go:64] FLAG: --image-gc-low-threshold="80" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018079 4894 flags.go:64] FLAG: --image-service-endpoint="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018088 4894 flags.go:64] FLAG: --kernel-memcg-notification="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018098 4894 flags.go:64] FLAG: --kube-api-burst="100" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018107 4894 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018116 4894 flags.go:64] FLAG: --kube-api-qps="50" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018125 4894 flags.go:64] FLAG: --kube-reserved="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018134 4894 flags.go:64] FLAG: --kube-reserved-cgroup="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018143 4894 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018152 4894 flags.go:64] FLAG: --kubelet-cgroups="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018161 4894 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018170 4894 flags.go:64] FLAG: --lock-file="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018178 4894 flags.go:64] FLAG: --log-cadvisor-usage="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018187 4894 flags.go:64] FLAG: --log-flush-frequency="5s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018196 4894 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018212 4894 flags.go:64] FLAG: --log-json-split-stream="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018220 4894 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018229 4894 flags.go:64] FLAG: --log-text-split-stream="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018238 4894 flags.go:64] FLAG: 
--logging-format="text" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018247 4894 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018257 4894 flags.go:64] FLAG: --make-iptables-util-chains="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018266 4894 flags.go:64] FLAG: --manifest-url="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018274 4894 flags.go:64] FLAG: --manifest-url-header="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018292 4894 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018302 4894 flags.go:64] FLAG: --max-open-files="1000000" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018313 4894 flags.go:64] FLAG: --max-pods="110" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018322 4894 flags.go:64] FLAG: --maximum-dead-containers="-1" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018331 4894 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018340 4894 flags.go:64] FLAG: --memory-manager-policy="None" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018349 4894 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018358 4894 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018368 4894 flags.go:64] FLAG: --node-ip="192.168.126.11" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018378 4894 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018399 4894 flags.go:64] FLAG: --node-status-max-images="50" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018409 4894 flags.go:64] FLAG: --node-status-update-frequency="10s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018418 4894 flags.go:64] FLAG: --oom-score-adj="-999" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018427 4894 flags.go:64] FLAG: --pod-cidr="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018448 4894 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018462 4894 flags.go:64] FLAG: --pod-manifest-path="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018471 4894 flags.go:64] FLAG: --pod-max-pids="-1" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018480 4894 flags.go:64] FLAG: --pods-per-core="0" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018489 4894 flags.go:64] FLAG: --port="10250" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018498 4894 flags.go:64] FLAG: --protect-kernel-defaults="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018507 4894 flags.go:64] FLAG: --provider-id="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018516 4894 flags.go:64] FLAG: --qos-reserved="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018525 4894 flags.go:64] FLAG: --read-only-port="10255" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018534 4894 flags.go:64] FLAG: --register-node="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018543 4894 flags.go:64] FLAG: 
--register-schedulable="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018552 4894 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018574 4894 flags.go:64] FLAG: --registry-burst="10" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018583 4894 flags.go:64] FLAG: --registry-qps="5" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018592 4894 flags.go:64] FLAG: --reserved-cpus="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018601 4894 flags.go:64] FLAG: --reserved-memory="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018611 4894 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018621 4894 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018630 4894 flags.go:64] FLAG: --rotate-certificates="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018639 4894 flags.go:64] FLAG: --rotate-server-certificates="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018647 4894 flags.go:64] FLAG: --runonce="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018686 4894 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018695 4894 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018705 4894 flags.go:64] FLAG: --seccomp-default="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018714 4894 flags.go:64] FLAG: --serialize-image-pulls="true" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018722 4894 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018732 4894 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018741 4894 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018750 4894 flags.go:64] FLAG: --storage-driver-password="root" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018759 4894 flags.go:64] FLAG: --storage-driver-secure="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018768 4894 flags.go:64] FLAG: --storage-driver-table="stats" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018777 4894 flags.go:64] FLAG: --storage-driver-user="root" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018786 4894 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018796 4894 flags.go:64] FLAG: --sync-frequency="1m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018804 4894 flags.go:64] FLAG: --system-cgroups="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018813 4894 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018839 4894 flags.go:64] FLAG: --system-reserved-cgroup="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018849 4894 flags.go:64] FLAG: --tls-cert-file="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018858 4894 flags.go:64] FLAG: --tls-cipher-suites="[]" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018879 4894 flags.go:64] FLAG: --tls-min-version="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018888 4894 flags.go:64] 
FLAG: --tls-private-key-file="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018896 4894 flags.go:64] FLAG: --topology-manager-policy="none" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018906 4894 flags.go:64] FLAG: --topology-manager-policy-options="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018915 4894 flags.go:64] FLAG: --topology-manager-scope="container" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018924 4894 flags.go:64] FLAG: --v="2" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018936 4894 flags.go:64] FLAG: --version="false" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018946 4894 flags.go:64] FLAG: --vmodule="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018957 4894 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.018966 4894 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019209 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019220 4894 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019229 4894 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019237 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019244 4894 feature_gate.go:330] unrecognized feature gate: Example Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019253 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019261 4894 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019268 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019276 4894 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019284 4894 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019292 4894 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019300 4894 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019307 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019318 4894 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019328 4894 feature_gate.go:330] unrecognized feature gate: OVNObservability Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019338 4894 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019347 4894 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
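The FLAG: lines above (flags.go:64) are the kubelet echoing every command-line flag with its effective value. A sketch, again assuming a saved copy of this journal, that folds the dump into a dictionary, which makes it easy to diff effective values between nodes or between boots:

# Hypothetical helper: collect the kubelet's FLAG: --name="value" dump into a dict.
import re

FLAG = re.compile(r'FLAG: (--[a-z0-9-]+)="(.*?)"')

def kubelet_flags(path="kubelet-journal.log"):
    flags = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for name, value in FLAG.findall(line):
                flags[name] = value
    return flags

if __name__ == "__main__":
    flags = kubelet_flags()
    print(flags.get("--config"))   # /etc/kubernetes/kubelet.conf
    print(flags.get("--node-ip"))  # 192.168.126.11
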
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019358 4894 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019367 4894 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019377 4894 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019387 4894 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019395 4894 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019404 4894 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019413 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019422 4894 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019430 4894 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019441 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019449 4894 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019457 4894 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019465 4894 feature_gate.go:330] unrecognized feature gate: PinnedImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019473 4894 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019481 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019489 4894 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019497 4894 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019505 4894 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019512 4894 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019521 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019529 4894 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019537 4894 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019545 4894 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019552 4894 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019560 4894 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019568 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jun 13 
04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019576 4894 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019583 4894 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019591 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019599 4894 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019606 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019614 4894 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019622 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019630 4894 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019638 4894 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019646 4894 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019681 4894 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019689 4894 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019697 4894 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019704 4894 feature_gate.go:330] unrecognized feature gate: SignatureStores Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019712 4894 feature_gate.go:330] unrecognized feature gate: NewOLM Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019721 4894 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019730 4894 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019737 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019746 4894 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019753 4894 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019761 4894 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019768 4894 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019779 4894 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019788 4894 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019796 4894 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019804 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019812 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.019822 4894 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.019845 4894 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.032943 4894 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.032988 4894 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033177 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033193 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033203 4894 feature_gate.go:330] unrecognized feature gate: OVNObservability Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033212 4894 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033221 4894 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033229 4894 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033238 4894 feature_gate.go:330] unrecognized feature gate: SignatureStores Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033246 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033254 4894 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033261 4894 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033269 4894 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033277 4894 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033285 4894 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033294 4894 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033302 4894 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033310 4894 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033318 4894 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033326 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033334 4894 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033342 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033353 4894 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033367 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033375 4894 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033383 4894 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033391 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033399 4894 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033407 4894 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033417 4894 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033425 4894 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033433 4894 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033441 4894 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033448 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033457 4894 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033466 4894 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033486 4894 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033494 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033502 4894 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033513 4894 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033523 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033531 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033539 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033546 4894 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033554 4894 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033562 4894 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033569 4894 feature_gate.go:330] unrecognized feature gate: Example Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033577 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033585 4894 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033593 4894 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033601 4894 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033609 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033616 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033624 4894 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033632 4894 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033641 4894 feature_gate.go:330] unrecognized feature gate: PinnedImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033649 4894 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033681 4894 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033692 4894 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033701 4894 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033710 4894 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033719 4894 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033727 4894 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033737 4894 feature_gate.go:330] unrecognized feature gate: NewOLM Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033745 4894 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033756 4894 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033764 4894 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033774 4894 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033782 4894 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033790 4894 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033800 4894 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033808 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.033827 4894 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.033841 4894 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034218 4894 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034238 4894 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034250 4894 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
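Each pass over the feature-gate configuration ends with a summary of the gates the kubelet actually recognizes, logged as "feature gates: {map[...]}" (feature_gate.go:386); the summaries in this boot are identical. A sketch that parses that Go map literal into a Python dict:

# Hypothetical helper: turn the "feature gates: {map[...]}" summary into a dict.
import re

SUMMARY = re.compile(r"feature gates: \{map\[(.*?)\]\}")

def feature_gates(path="kubelet-journal.log"):
    gates = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for body in SUMMARY.findall(line):
                for pair in body.split():
                    name, _, value = pair.partition(":")
                    gates[name] = (value == "true")
    return gates

if __name__ == "__main__":
    gates = feature_gates()
    print(gates.get("ValidatingAdmissionPolicy"))  # True
    print(gates.get("NodeSwap"))                   # False
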
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034261 4894 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034271 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034279 4894 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034288 4894 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034296 4894 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034303 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034311 4894 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034319 4894 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034330 4894 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034340 4894 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034349 4894 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034357 4894 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034366 4894 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034374 4894 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034383 4894 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034391 4894 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034399 4894 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034407 4894 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034415 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034424 4894 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034432 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034440 4894 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034448 4894 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034456 4894 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034464 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 
04:50:42.034472 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034480 4894 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034488 4894 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034495 4894 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034503 4894 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034511 4894 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034536 4894 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034544 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034552 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034560 4894 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034568 4894 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034576 4894 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034584 4894 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034591 4894 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034599 4894 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034607 4894 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034615 4894 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034626 4894 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034636 4894 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034645 4894 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034711 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034721 4894 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034730 4894 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034739 4894 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034748 4894 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034757 4894 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034765 4894 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034773 4894 feature_gate.go:330] unrecognized feature gate: OVNObservability Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034781 4894 feature_gate.go:330] unrecognized feature gate: PinnedImages Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034789 4894 feature_gate.go:330] unrecognized feature gate: SignatureStores Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034797 4894 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034805 4894 feature_gate.go:330] unrecognized feature gate: NewOLM Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034813 4894 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034821 4894 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034829 4894 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034838 4894 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034849 4894 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034859 4894 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034867 4894 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034877 4894 feature_gate.go:330] unrecognized feature gate: Example Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034885 4894 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034894 4894 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.034914 4894 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.034927 4894 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.036052 4894 server.go:940] "Client rotation is on, will bootstrap in background" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.043153 4894 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.043287 4894 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
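The certificate store line above names the kubelet's client credential, /var/lib/kubelet/pki/kubelet-client-current.pem, which bundles the key and certificate. A hedged sketch for checking its expiry directly on the node; it assumes the third-party cryptography package is installed and must run with enough privilege to read the file:

# Hedged sketch: report the expiry of the kubelet's client certificate.
import re
from cryptography import x509  # third-party dependency (pip install cryptography)

PEM_CERT = re.compile(rb"-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----", re.S)

def client_cert_expiry(path="/var/lib/kubelet/pki/kubelet-client-current.pem"):
    data = open(path, "rb").read()
    # The file holds both the private key and the certificate; take the first cert block.
    cert = x509.load_pem_x509_certificate(PEM_CERT.search(data).group(0))
    # not_valid_after_utc needs cryptography >= 42; older releases expose not_valid_after.
    return cert.not_valid_after_utc

if __name__ == "__main__":
    print(client_cert_expiry())  # should agree with the expiration the kubelet logs next
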
Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.048011 4894 server.go:997] "Starting client certificate rotation" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.048065 4894 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.048322 4894 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-23 22:14:05.995235012 +0000 UTC Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.048475 4894 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 4649h23m23.946765617s for next certificate rotation Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.074489 4894 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.079860 4894 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.097822 4894 log.go:25] "Validated CRI v1 runtime API" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.135435 4894 log.go:25] "Validated CRI v1 image API" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.137874 4894 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.143625 4894 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-06-13-04-42-02-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.143715 4894 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.166950 4894 manager.go:217] Machine: {Timestamp:2025-06-13 04:50:42.165113209 +0000 UTC m=+0.611360742 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:af37a81c-fbe7-481a-97c1-991c857af28f BootID:b922c658-a795-4c23-ac0c-edb6a97d57cd Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d3:f3:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d3:f3:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f2:de:57 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f3:0e:5c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:59:df:0d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c8:44:3d Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:8e:71:ca Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:48:c6:a2:36:8b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1a:ad:61:0f:fd:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.167374 4894 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.167548 4894 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.169112 4894 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.169553 4894 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.169632 4894 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.170433 4894 topology_manager.go:138] "Creating topology manager with none policy" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.170452 4894 container_manager_linux.go:303] "Creating device plugin manager" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.171143 4894 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.171222 4894 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.171746 4894 state_mem.go:36] "Initialized new in-memory state store" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.171967 4894 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.176373 4894 kubelet.go:418] "Attempting to sync node with API server" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.176413 4894 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.176475 4894 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.176496 4894 kubelet.go:324] "Adding apiserver pod source" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.176516 4894 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.182099 4894 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.183269 4894 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.184851 4894 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.185448 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.185562 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.185614 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.185648 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187054 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187113 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187130 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187146 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187169 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187184 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187199 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187222 4894 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187241 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187274 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187307 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187343 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.187420 4894 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.188326 4894 server.go:1280] "Started kubelet" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.188471 4894 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.188945 4894 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.189760 4894 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.189802 4894 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jun 13 04:50:42 crc systemd[1]: Started Kubernetes Kubelet. Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.194183 4894 server.go:460] "Adding debug handlers to kubelet server" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.194261 4894 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.194873 4894 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 20:55:08.08349492 +0000 UTC Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.194954 4894 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 3760h4m25.888546992s for next certificate rotation Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.195057 4894 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.195432 4894 volume_manager.go:287] "The desired_state_of_world populator starts" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.195477 4894 volume_manager.go:289] "Starting Kubelet Volume Manager" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.195789 4894 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.197170 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.197576 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.197920 4894 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.196620 4894 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.184880f45913a21c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-06-13 04:50:42.188272156 +0000 UTC m=+0.634519659,LastTimestamp:2025-06-13 04:50:42.188272156 +0000 UTC m=+0.634519659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.198829 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199518 4894 factory.go:153] Registering CRI-O factory Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199554 4894 factory.go:221] Registration of the crio container factory successfully Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199837 4894 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199894 4894 factory.go:55] Registering systemd factory Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199908 4894 factory.go:221] Registration of the systemd container factory successfully Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199941 4894 factory.go:103] Registering Raw factory Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.199963 4894 manager.go:1196] Started watching for new ooms in manager Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.201550 4894 manager.go:319] Starting recovery of all containers Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222539 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222609 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222631 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222650 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222707 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222725 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222743 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222762 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222783 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222801 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222818 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222836 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222856 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222878 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222917 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222935 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222953 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222971 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.222988 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223005 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223024 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223042 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223059 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223081 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223100 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223119 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223158 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223177 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223197 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223230 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223249 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223267 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223285 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223340 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223365 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223384 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223426 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223462 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223482 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223500 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223517 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223536 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223556 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223573 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223594 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223613 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223630 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223649 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223691 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223709 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223728 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223748 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223787 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223807 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223827 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223846 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223865 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223882 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223902 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223919 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223937 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223955 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.223988 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224007 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224026 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224044 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224061 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224078 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224094 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224112 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224132 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224170 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224189 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224208 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224226 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224243 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224261 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224278 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224296 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224313 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224332 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.224352 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226490 4894 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226550 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226576 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226598 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226620 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226643 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226706 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226728 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.226993 4894 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227012 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227035 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227054 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227074 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227094 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227114 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227135 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227154 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227173 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227200 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227221 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227242 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227262 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227281 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227389 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227415 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227439 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227472 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227490 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227511 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227531 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227554 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227574 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227594 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227613 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227633 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227703 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227727 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227745 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227763 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227783 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227800 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227819 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227874 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227894 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227912 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227934 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227961 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227980 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.227999 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228018 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228037 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228056 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228074 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228100 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228125 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228145 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228166 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228184 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228204 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228223 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228244 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228271 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228291 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228316 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228337 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228356 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228375 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228396 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228423 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228444 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228488 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228508 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228529 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228549 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228572 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228593 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228612 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228630 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228649 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228697 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228717 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228737 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228756 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228778 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228799 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228827 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228854 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228873 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228894 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228914 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228935 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228955 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.228983 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229002 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229024 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229044 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229063 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229083 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229102 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229123 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229143 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229164 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229184 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229211 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229231 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229251 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229273 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229292 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229312 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229334 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229353 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229373 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229392 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229411 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229430 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229450 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229470 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229488 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229510 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229530 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229550 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229569 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229589 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229608 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229632 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229674 4894 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229693 4894 reconstruct.go:97] "Volume reconstruction finished" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.229707 4894 reconciler.go:26] "Reconciler: start to sync state" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.237188 4894 manager.go:324] Recovery completed Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.248425 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.250037 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.250094 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.250113 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.251149 4894 cpu_manager.go:225] "Starting CPU manager" policy="none" Jun 13 04:50:42 crc 
kubenswrapper[4894]: I0613 04:50:42.251279 4894 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.251438 4894 state_mem.go:36] "Initialized new in-memory state store" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.264080 4894 policy_none.go:49] "None policy: Start" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.266102 4894 memory_manager.go:170] "Starting memorymanager" policy="None" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.266149 4894 state_mem.go:35] "Initializing new in-memory state store" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.273135 4894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.274740 4894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.275323 4894 status_manager.go:217] "Starting to sync pod status with apiserver" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.275356 4894 kubelet.go:2335] "Starting kubelet main sync loop" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.275407 4894 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.276096 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.276140 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.298628 4894 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.317028 4894 manager.go:334] "Starting Device Plugin manager" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.317766 4894 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.319255 4894 server.go:79] "Starting device plugin registration server" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.319848 4894 eviction_manager.go:189] "Eviction manager: starting control loop" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.319900 4894 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.320216 4894 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.320639 4894 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.320675 4894 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.327004 4894 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.375732 4894 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.375869 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.378759 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.378824 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.378843 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.379053 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.379565 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.379693 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.380463 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.380505 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.380526 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.380666 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.380975 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381047 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381079 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381134 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381166 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381783 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381808 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381816 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.381996 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.382025 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.382067 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.382294 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.382369 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.382420 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.383423 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.383479 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.383488 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.383988 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.384008 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.384017 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.384188 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.384520 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.384580 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.386916 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.386980 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.387009 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.387974 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.388094 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.390193 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.390369 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.390684 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.392896 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.392959 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.392978 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.401092 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.421117 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.422229 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.422291 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.422315 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.422357 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.422974 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Jun 13 04:50:42 
crc kubenswrapper[4894]: I0613 04:50:42.432232 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432321 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432391 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432454 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432538 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432730 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432813 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432892 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.432949 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc 
kubenswrapper[4894]: I0613 04:50:42.432997 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.433041 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.433075 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.433104 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.433131 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.433162 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534230 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534301 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534347 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534389 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534408 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534417 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534422 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534498 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534514 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534494 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534528 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534496 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534585 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc 
kubenswrapper[4894]: I0613 04:50:42.534616 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534692 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534695 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534681 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534754 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534784 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534857 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534884 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534866 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534914 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534922 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534863 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.534917 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.535009 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.535079 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.624058 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.625703 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.625743 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.625756 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.625788 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.626089 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection 
refused" node="crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.738978 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.763791 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.785776 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d6d3a2c4c2b4b3cd933d95cc9124884415ce2d57bf7947892b37149c49e52653 WatchSource:0}: Error finding container d6d3a2c4c2b4b3cd933d95cc9124884415ce2d57bf7947892b37149c49e52653: Status 404 returned error can't find the container with id d6d3a2c4c2b4b3cd933d95cc9124884415ce2d57bf7947892b37149c49e52653 Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.792937 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.793637 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-feee3dd6630ad4270a9ec69e7ea08cdac0c3802b6c3369d5fbc6cb226a3539ae WatchSource:0}: Error finding container feee3dd6630ad4270a9ec69e7ea08cdac0c3802b6c3369d5fbc6cb226a3539ae: Status 404 returned error can't find the container with id feee3dd6630ad4270a9ec69e7ea08cdac0c3802b6c3369d5fbc6cb226a3539ae Jun 13 04:50:42 crc kubenswrapper[4894]: E0613 04:50:42.802237 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.806005 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f1d5e1006ddf4d1fe34fd0ca0f60b147bb5b66222a69fbf5d85ad61852fdaba4 WatchSource:0}: Error finding container f1d5e1006ddf4d1fe34fd0ca0f60b147bb5b66222a69fbf5d85ad61852fdaba4: Status 404 returned error can't find the container with id f1d5e1006ddf4d1fe34fd0ca0f60b147bb5b66222a69fbf5d85ad61852fdaba4 Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.808923 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: I0613 04:50:42.815915 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.827752 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-773bb5a721245e9012e6fa82b757025a41733570b4d7ede38be3322129267f44 WatchSource:0}: Error finding container 773bb5a721245e9012e6fa82b757025a41733570b4d7ede38be3322129267f44: Status 404 returned error can't find the container with id 773bb5a721245e9012e6fa82b757025a41733570b4d7ede38be3322129267f44 Jun 13 04:50:42 crc kubenswrapper[4894]: W0613 04:50:42.830512 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2fcce090a3c0f02a923e87c34cf2dd3668bf149899049509e45709015f801238 WatchSource:0}: Error finding container 2fcce090a3c0f02a923e87c34cf2dd3668bf149899049509e45709015f801238: Status 404 returned error can't find the container with id 2fcce090a3c0f02a923e87c34cf2dd3668bf149899049509e45709015f801238 Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.027184 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.028876 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.028960 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.028978 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.029014 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.029644 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Jun 13 04:50:43 crc kubenswrapper[4894]: W0613 04:50:43.046373 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.046501 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.191283 4894 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.279982 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"feee3dd6630ad4270a9ec69e7ea08cdac0c3802b6c3369d5fbc6cb226a3539ae"} Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.281213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d6d3a2c4c2b4b3cd933d95cc9124884415ce2d57bf7947892b37149c49e52653"} Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.282343 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2fcce090a3c0f02a923e87c34cf2dd3668bf149899049509e45709015f801238"} Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.283146 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"773bb5a721245e9012e6fa82b757025a41733570b4d7ede38be3322129267f44"} Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.283920 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f1d5e1006ddf4d1fe34fd0ca0f60b147bb5b66222a69fbf5d85ad61852fdaba4"} Jun 13 04:50:43 crc kubenswrapper[4894]: W0613 04:50:43.331772 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.331856 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:43 crc kubenswrapper[4894]: W0613 04:50:43.501317 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.501398 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.603008 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s" Jun 13 04:50:43 crc kubenswrapper[4894]: W0613 04:50:43.746544 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:43 
crc kubenswrapper[4894]: E0613 04:50:43.746647 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.830633 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.831820 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.831861 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.831875 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:43 crc kubenswrapper[4894]: I0613 04:50:43.831901 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:43 crc kubenswrapper[4894]: E0613 04:50:43.832560 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.191192 4894 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.289057 4894 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f6d6e35e8f90b0304b7bc94c80ce204a14da587deb7fb9df8b3dd1de4da99380" exitCode=0 Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.289136 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f6d6e35e8f90b0304b7bc94c80ce204a14da587deb7fb9df8b3dd1de4da99380"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.289282 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.290749 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.290793 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.290808 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.291746 4894 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="50ec61670fa0f2135b2a61681f5cc1ef380106d4e289cccf3dc24d6a319a495a" exitCode=0 Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.291809 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.291848 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"50ec61670fa0f2135b2a61681f5cc1ef380106d4e289cccf3dc24d6a319a495a"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.293291 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.293319 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.293328 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.295557 4894 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aaa31c2733d26100b34766e083ef09222855cd867c60a77fe94d15628c79c903" exitCode=0 Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.295612 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aaa31c2733d26100b34766e083ef09222855cd867c60a77fe94d15628c79c903"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.295776 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.297336 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.297387 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.297407 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.301239 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.301283 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.301306 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.301328 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.301459 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.302889 4894 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.302908 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.302920 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.307411 4894 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f" exitCode=0 Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.307492 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f"} Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.307684 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.313953 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.314004 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.314014 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.317353 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.319185 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.319212 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:44 crc kubenswrapper[4894]: I0613 04:50:44.319224 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.191448 4894 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:45 crc kubenswrapper[4894]: E0613 04:50:45.203898 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s" Jun 13 04:50:45 crc kubenswrapper[4894]: W0613 04:50:45.282809 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:45 crc kubenswrapper[4894]: E0613 04:50:45.282893 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.312335 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f67f46d3f37b45173b0cc6d883eaa106b2f57c175f325842b7876355b81484b4"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.312371 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e138ef39aa341e35528ebff11c14ea8661b5d7a06a4c84dd91f2d243e8317aa"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.312383 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"18b8eaa3ccf19686ecbc6df08fcd83cb60e5bfb2f7a251a4e502323751d21aef"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.312393 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3cbc2bb91aafe0fcb8323a43028a1df0e68d2f6dfa62fd108946985dd6c60ac"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.313677 4894 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927" exitCode=0 Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.313755 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.313870 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.314830 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.314868 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.314880 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.315571 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b99b8bd2db5a2a9fbca07f6c9b74e93667246c2ffaaf9f51d8d80cf48de2e951"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.315598 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.316497 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.316521 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:45 crc 
kubenswrapper[4894]: I0613 04:50:45.316557 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.318053 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.318065 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1e0094904c9d2809a600ba09abc1c7b0b2d39284882cba2da8023924ddfb06f1"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.318112 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e39f4830776bc235c3e942d066bd8bc87bf5c351191c5736262f3bb7084d15e"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.318129 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f2bc504e1f8f46bbe7eeba44682df3916a8cbc1b58e8163ed1cb9065a741b693"} Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.318139 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323388 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323451 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323470 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323445 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323535 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.323555 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.433455 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.434925 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.434970 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.434983 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:45 crc kubenswrapper[4894]: I0613 04:50:45.435012 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:45 crc kubenswrapper[4894]: E0613 04:50:45.435437 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection 
refused" node="crc" Jun 13 04:50:45 crc kubenswrapper[4894]: W0613 04:50:45.782929 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Jun 13 04:50:45 crc kubenswrapper[4894]: E0613 04:50:45.783056 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.324018 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6"} Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.324150 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326044 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326097 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326114 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326191 4894 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b" exitCode=0 Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326269 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326603 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326902 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b"} Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326968 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.326989 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327711 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327740 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327754 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327763 4894 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327770 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327774 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327866 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327880 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:46 crc kubenswrapper[4894]: I0613 04:50:46.327891 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.218689 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333746 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"67a833e5ae119f0f69551d11b31de6f2daad4c62c1b5f35c2efd0d257c19d99a"} Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333811 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8548e55d96a6e31dab5d796da491dd8495ffea04fbf09b9b09453afb1252110b"} Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333834 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"10f8cfbc52fb95b99f483869905cc139391e35ae07dfeb77b6ef8d76a8ade4e6"} Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333869 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333894 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.333928 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.337898 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.337952 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.337971 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.339483 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.340265 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.340302 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 
04:50:47 crc kubenswrapper[4894]: I0613 04:50:47.352892 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.118449 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.343568 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.343619 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"40f1ef4484045b68bf2902e21099f091bc8be12b8096fdc3787ee472cd81e680"} Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.343717 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b991c930bf472eced0e8f5fe8f046080d0b1fd0f81aac9f8661fb07bc5b6a81f"} Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.343737 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.343737 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.345959 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.346025 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.346032 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.346081 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.346098 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.346050 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.386015 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.636531 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.638271 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.638338 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.638362 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:48 crc kubenswrapper[4894]: I0613 04:50:48.638406 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.346818 4894 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.346918 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.346964 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348746 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348789 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348812 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348829 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348851 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.348835 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.841448 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.841802 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.843426 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.843490 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:49 crc kubenswrapper[4894]: I0613 04:50:49.843509 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.271912 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.349782 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.350184 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.351221 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.351280 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.351299 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.352052 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 
04:50:50.352113 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.352134 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.423573 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:50 crc kubenswrapper[4894]: I0613 04:50:50.434490 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.353055 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.359156 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.359409 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.359605 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.962725 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.963030 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.964465 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.964588 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:51 crc kubenswrapper[4894]: I0613 04:50:51.964697 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.105408 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:52 crc kubenswrapper[4894]: E0613 04:50:52.327126 4894 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.355446 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.356971 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.357039 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.357065 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.842226 4894 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.842336 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.992870 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.993145 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.994793 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.994850 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:52 crc kubenswrapper[4894]: I0613 04:50:52.994863 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:53 crc kubenswrapper[4894]: I0613 04:50:53.358287 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:53 crc kubenswrapper[4894]: I0613 04:50:53.359797 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:53 crc kubenswrapper[4894]: I0613 04:50:53.359856 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:53 crc kubenswrapper[4894]: I0613 04:50:53.359875 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:53 crc kubenswrapper[4894]: I0613 04:50:53.366481 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:50:54 crc kubenswrapper[4894]: I0613 04:50:54.361388 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:50:54 crc kubenswrapper[4894]: I0613 04:50:54.362753 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:50:54 crc kubenswrapper[4894]: I0613 04:50:54.362798 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:50:54 crc kubenswrapper[4894]: I0613 04:50:54.362815 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:50:56 crc kubenswrapper[4894]: W0613 04:50:56.090488 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.090638 4894 trace.go:236] Trace[1945549835]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Jun-2025 04:50:46.088) (total time: 10001ms): Jun 13 04:50:56 crc 
kubenswrapper[4894]: Trace[1945549835]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:50:56.090) Jun 13 04:50:56 crc kubenswrapper[4894]: Trace[1945549835]: [10.001694677s] [10.001694677s] END Jun 13 04:50:56 crc kubenswrapper[4894]: E0613 04:50:56.090709 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.192101 4894 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jun 13 04:50:56 crc kubenswrapper[4894]: W0613 04:50:56.396547 4894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.396679 4894 trace.go:236] Trace[2029283674]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Jun-2025 04:50:46.394) (total time: 10001ms): Jun 13 04:50:56 crc kubenswrapper[4894]: Trace[2029283674]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:50:56.396) Jun 13 04:50:56 crc kubenswrapper[4894]: Trace[2029283674]: [10.001815961s] [10.001815961s] END Jun 13 04:50:56 crc kubenswrapper[4894]: E0613 04:50:56.396706 4894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.577211 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.577266 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.581696 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jun 13 04:50:56 crc kubenswrapper[4894]: I0613 04:50:56.581730 4894 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jun 13 04:50:57 crc kubenswrapper[4894]: I0613 04:50:57.362874 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]log ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]etcd ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/openshift.io-api-request-count-filter ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/openshift.io-startkubeinformers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-apiserver-admission-initializer ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/generic-apiserver-start-informers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/priority-and-fairness-config-consumer ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/priority-and-fairness-filter ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/storage-object-count-tracker-hook ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-apiextensions-informers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-apiextensions-controllers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/crd-informer-synced ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-system-namespaces-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-cluster-authentication-info-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-legacy-token-tracking-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-service-ip-repair-controllers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jun 13 04:50:57 crc kubenswrapper[4894]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/priority-and-fairness-config-producer ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/bootstrap-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/start-kube-aggregator-informers ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-status-local-available-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-status-remote-available-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-registration-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-wait-for-first-sync ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-discovery-controller 
ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/kube-apiserver-autoregistration ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]autoregister-completion ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-openapi-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: [+]poststarthook/apiservice-openapiv3-controller ok Jun 13 04:50:57 crc kubenswrapper[4894]: livez check failed Jun 13 04:50:57 crc kubenswrapper[4894]: I0613 04:50:57.362947 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:00 crc kubenswrapper[4894]: I0613 04:51:00.741051 4894 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jun 13 04:51:01 crc kubenswrapper[4894]: E0613 04:51:01.570392 4894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jun 13 04:51:01 crc kubenswrapper[4894]: E0613 04:51:01.578812 4894 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.580914 4894 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.580947 4894 trace.go:236] Trace[1566420439]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Jun-2025 04:50:50.602) (total time: 10978ms): Jun 13 04:51:01 crc kubenswrapper[4894]: Trace[1566420439]: ---"Objects listed" error: 10978ms (04:51:01.580) Jun 13 04:51:01 crc kubenswrapper[4894]: Trace[1566420439]: [10.978404043s] [10.978404043s] END Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.581204 4894 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.582480 4894 trace.go:236] Trace[685741034]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Jun-2025 04:50:49.912) (total time: 11669ms): Jun 13 04:51:01 crc kubenswrapper[4894]: Trace[685741034]: ---"Objects listed" error: 11669ms (04:51:01.582) Jun 13 04:51:01 crc kubenswrapper[4894]: Trace[685741034]: [11.669885321s] [11.669885321s] END Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.582529 4894 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.616897 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49440->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.616987 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49440->192.168.126.11:17697: read: connection 
reset by peer" Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.940696 4894 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.947645 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.951235 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.962972 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jun 13 04:51:01 crc kubenswrapper[4894]: I0613 04:51:01.963011 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.185617 4894 apiserver.go:52] "Watching apiserver" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.187644 4894 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188036 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-4668k"] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188438 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188555 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188563 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.188853 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188863 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188913 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.188934 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.189146 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.189173 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.190079 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.194388 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.195705 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.196399 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.197244 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198203 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198483 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198532 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198617 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198561 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.198880 4894 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.199439 4894 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.199504 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.199836 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.214419 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.229418 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.241335 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.254202 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.265361 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.283203 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284373 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284445 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284484 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284517 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284589 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284838 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.284990 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.285014 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.285106 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.285445 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.285639 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286116 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286209 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286277 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286618 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286309 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286754 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.286828 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287043 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287268 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287314 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287358 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287724 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.287889 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.288028 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.288261 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.288490 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.288611 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.288771 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289252 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.289308 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:02.788645633 +0000 UTC m=+21.234893086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289345 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289521 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289369 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289571 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289588 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290189 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.289775 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290024 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290144 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290166 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290215 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290292 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290308 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290728 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290767 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290786 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290939 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.290968 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.291024 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.291345 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.291381 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.291573 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.291819 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292018 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292054 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292075 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292097 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292115 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292130 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292147 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292164 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292181 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292203 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292226 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292242 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292263 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292278 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292293 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292308 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292326 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292341 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292357 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292372 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292390 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292405 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292420 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292445 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292461 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292495 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292512 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292526 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292541 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292555 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292570 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292586 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292601 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292618 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292638 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292669 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292684 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292698 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292712 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292727 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292743 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292757 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292774 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292789 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292815 4894 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292832 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292847 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292863 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292879 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292865 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292894 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.292974 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293006 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293031 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293054 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293079 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293123 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293147 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293174 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293199 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293221 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293242 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293246 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293278 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293298 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293315 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293331 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293358 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293373 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293389 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293405 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293454 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293512 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293534 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293552 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293568 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293583 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293599 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293617 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293633 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293637 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293651 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293726 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293751 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293774 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293802 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293825 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293851 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293877 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293901 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293930 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293954 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293976 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293997 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294020 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294048 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294074 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294097 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294121 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294144 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294167 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294192 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294215 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294239 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294263 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294298 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294322 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294344 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294367 4894 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294392 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294415 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294440 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294463 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294485 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294508 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294534 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294573 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294618 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294641 4894 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294691 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294714 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294738 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294763 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294787 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294809 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294832 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294855 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294881 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294904 4894 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294927 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294949 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294974 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294998 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295023 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295047 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295071 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295097 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295144 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295168 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295193 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295216 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295241 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295271 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295295 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295319 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295343 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295368 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295393 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295415 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295440 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295463 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295489 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295514 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295539 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295562 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295586 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295609 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295632 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295678 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295705 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295729 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295754 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295778 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295803 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295828 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295856 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295881 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295906 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295930 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295957 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295982 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296010 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296064 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296102 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296138 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296167 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48g2\" (UniqueName: \"kubernetes.io/projected/2413124f-bf41-452a-94ff-eda830d6dc91-kube-api-access-s48g2\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296199 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296230 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2413124f-bf41-452a-94ff-eda830d6dc91-hosts-file\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296260 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296301 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296333 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296362 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296390 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296419 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296446 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 
04:51:02.296472 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296498 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296526 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297249 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297283 4894 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297299 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297325 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297341 4894 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297358 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297374 4894 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297395 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297410 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" 
(UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297429 4894 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297451 4894 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297468 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297484 4894 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297498 4894 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297517 4894 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297531 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297546 4894 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297564 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297582 4894 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297596 4894 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297612 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297631 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297645 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297684 4894 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297700 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297722 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297738 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297756 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297774 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.301994 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306998 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.293825 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308174 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294172 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294525 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294720 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295057 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295453 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.295805 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296084 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296283 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.296542 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297381 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297602 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297687 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.297755 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298032 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298101 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298249 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298263 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308469 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298399 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298459 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298572 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298647 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.298939 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.299064 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.299597 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.299651 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300295 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300284 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300489 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300614 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300893 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.300921 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.301175 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.301219 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.301336 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.301701 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302014 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302152 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302609 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302642 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302655 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.302841 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303072 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303129 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303261 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303351 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303495 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303521 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303747 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303823 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303879 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303902 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.303936 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304114 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304285 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304401 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304565 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304590 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304703 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304708 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.304806 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305020 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305120 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305283 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305312 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305388 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305746 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.305957 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306005 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306239 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306349 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306452 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306517 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306875 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306902 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.306975 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.307161 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.307463 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.307834 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.307874 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308040 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308137 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308142 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308153 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.294373 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308415 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308637 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.308948 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.309543 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.309963 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.309993 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.310063 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.310417 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.310193 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.310906 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.311163 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.311264 4894 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.311326 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:02.811307167 +0000 UTC m=+21.257554630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.311599 4894 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.311638 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:02.811624586 +0000 UTC m=+21.257872049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.312236 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.312586 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.312792 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.314694 4894 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.315493 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.315935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.317776 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.318451 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.318711 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.319266 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.319857 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.320097 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.320386 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.326798 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.327793 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.328070 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.328811 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.329663 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.329756 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.329804 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.329830 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.329844 4894 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.331216 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.328371 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.328365 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.331628 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.331906 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.331968 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.331991 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:02.831969113 +0000 UTC m=+21.278216576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.335696 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.336069 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.335980 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.336175 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.336346 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.336956 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.337168 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.337235 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.337312 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.338179 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.338196 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.338209 4894 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.338245 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:02.838234774 +0000 UTC m=+21.284482237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.338678 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.339012 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.339704 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.340387 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.341082 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.341330 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.341911 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.342163 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.342393 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.342858 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.343149 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.343353 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.343637 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.343892 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.344055 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.344516 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.344768 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.345390 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.345565 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.345772 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.345934 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.346124 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.346350 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.346536 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.347154 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.347318 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.347465 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.347885 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.347944 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.348117 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.349728 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.349921 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.350072 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.350111 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.350779 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.351947 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.354692 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.357344 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.358123 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.358568 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.361337 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.362289 4894 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.362354 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.372172 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.373821 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.374857 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.381702 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.386314 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.388052 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.394147 4894 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6" exitCode=255 Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.394927 4894 scope.go:117] "RemoveContainer" containerID="2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.395134 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6"} Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398483 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398554 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398575 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48g2\" (UniqueName: \"kubernetes.io/projected/2413124f-bf41-452a-94ff-eda830d6dc91-kube-api-access-s48g2\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398590 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2413124f-bf41-452a-94ff-eda830d6dc91-hosts-file\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398632 4894 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398642 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398669 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398679 4894 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398687 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398695 4894 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398704 4894 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398712 4894 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398720 4894 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398728 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398737 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398747 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398755 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398767 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398777 4894 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398786 4894 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398798 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398809 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398820 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398831 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398841 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398851 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398862 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398872 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398883 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398893 4894 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398903 4894 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398911 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398919 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398928 4894 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398935 4894 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398945 4894 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398956 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398964 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398972 4894 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398982 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398990 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.398998 4894 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399009 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399020 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" 
(UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399035 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399047 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399055 4894 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399064 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399072 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399081 4894 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399089 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399097 4894 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399121 4894 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399130 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399140 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399154 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399163 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" 
(UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399172 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399181 4894 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399190 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399198 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399211 4894 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399220 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399228 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399236 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399244 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399254 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399264 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399273 4894 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399282 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399291 4894 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399300 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399308 4894 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399316 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399324 4894 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399335 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399342 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399352 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399360 4894 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399368 4894 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399376 4894 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399383 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399391 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399399 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399407 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399415 4894 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399423 4894 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399432 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399440 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399449 4894 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399457 4894 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399466 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399475 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399483 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399492 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399500 4894 reconciler_common.go:293] "Volume detached for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399508 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399516 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399523 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399532 4894 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399543 4894 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399551 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399559 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399567 4894 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399576 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399584 4894 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399591 4894 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399601 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399609 4894 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399618 4894 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399629 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399653 4894 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399674 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399683 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399691 4894 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399699 4894 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399711 4894 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399719 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399727 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399735 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399745 4894 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399753 4894 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399761 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399771 4894 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399779 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399786 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399794 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399802 4894 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399810 4894 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399817 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399825 4894 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399833 4894 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399841 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399849 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399856 4894 reconciler_common.go:293] "Volume detached for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399864 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399873 4894 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399881 4894 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399889 4894 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399897 4894 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399905 4894 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399914 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399922 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399931 4894 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399940 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399948 4894 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399957 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399966 4894 reconciler_common.go:293] "Volume 
detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399974 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399982 4894 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399990 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.399998 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400006 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400015 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400025 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400032 4894 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400040 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400048 4894 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400056 4894 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400064 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400073 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400081 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400090 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400098 4894 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400106 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400113 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400122 4894 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400175 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2413124f-bf41-452a-94ff-eda830d6dc91-hosts-file\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400210 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400311 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.400721 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.400868 4894 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.410726 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.416046 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48g2\" (UniqueName: \"kubernetes.io/projected/2413124f-bf41-452a-94ff-eda830d6dc91-kube-api-access-s48g2\") pod \"node-resolver-4668k\" (UID: \"2413124f-bf41-452a-94ff-eda830d6dc91\") " pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.417406 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.425378 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.431030 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.438548 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.446954 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5d8314e-c0d3-4fb4-a0c3-205a612208ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cbc2bb91aafe0fcb8323a43028a1df0e68d2f6dfa62fd108946985dd6c60ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e138ef39aa341e35528ebff11c14ea8661b5d7a06a4c84dd91f2d243e8317aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b8eaa3ccf19686ecbc6df08fcd83cb60e5bfb2f7a251a4e502323751d21aef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-06-13
T04:51:01Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0613 04:50:55.890925 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0613 04:50:55.892575 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1445056400/tls.crt::/tmp/serving-cert-1445056400/tls.key\\\\\\\"\\\\nI0613 04:51:01.587444 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0613 04:51:01.598034 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0613 04:51:01.598070 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0613 04:51:01.598108 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0613 04:51:01.598121 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0613 04:51:01.609300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0613 04:51:01.609342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0613 04:51:01.609342 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0613 04:51:01.609351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0613 04:51:01.609409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0613 04:51:01.609416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0613 04:51:01.609422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0613 04:51:01.609429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0613 04:51:01.610427 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f67f46d3f37b45173b0cc6d883eaa106b2f57c175f325842b7876355b81484b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.455098 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.462880 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.472063 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.482979 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.491461 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.502452 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.504839 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.512862 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.514895 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4668k" Jun 13 04:51:02 crc kubenswrapper[4894]: W0613 04:51:02.518101 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6e6a7032679fb8742c001b72ebe8d3c8e1017ee6398a85cbb17d9a234c705bbb WatchSource:0}: Error finding container 6e6a7032679fb8742c001b72ebe8d3c8e1017ee6398a85cbb17d9a234c705bbb: Status 404 returned error can't find the container with id 6e6a7032679fb8742c001b72ebe8d3c8e1017ee6398a85cbb17d9a234c705bbb Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.525002 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.525068 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.533794 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.536920 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.602123 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.640085 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gt4w4"] Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.640537 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.643844 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.643932 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.644283 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.646214 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.663430 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.676886 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.686094 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:02 crc kubenswrapper[4894]: W0613 04:51:02.692995 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5fa52f857b1e80138fb1a355bd532d60c8a503e8da4343a09e0d525e797757c8 WatchSource:0}: Error finding container 5fa52f857b1e80138fb1a355bd532d60c8a503e8da4343a09e0d525e797757c8: Status 404 returned error can't find the container with id 5fa52f857b1e80138fb1a355bd532d60c8a503e8da4343a09e0d525e797757c8 Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.695296 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.703162 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.703505 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: W0613 04:51:02.705889 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-77087ff55c4bbbdde48b586cb98bd4ba608a1c06c61a162ed875b18a09cb7464 WatchSource:0}: Error finding container 77087ff55c4bbbdde48b586cb98bd4ba608a1c06c61a162ed875b18a09cb7464: Status 404 returned error can't find the container with id 77087ff55c4bbbdde48b586cb98bd4ba608a1c06c61a162ed875b18a09cb7464 Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.716077 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.725679 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.733676 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.753701 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5d8314e-c0d3-4fb4-a0c3-205a612208ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cbc2bb91aafe0fcb8323a43028a1df0e68d2f6dfa62fd108946985dd6c60ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e138ef39aa341e35528ebff11c14ea8661b5d7a06a4c84dd91f2d243e8317aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b8eaa3ccf19686ecbc6df08fcd83cb60e5bfb2f7a251a4e502323751d21aef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-06-13
T04:51:01Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0613 04:50:55.890925 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0613 04:50:55.892575 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1445056400/tls.crt::/tmp/serving-cert-1445056400/tls.key\\\\\\\"\\\\nI0613 04:51:01.587444 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0613 04:51:01.598034 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0613 04:51:01.598070 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0613 04:51:01.598108 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0613 04:51:01.598121 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0613 04:51:01.609300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0613 04:51:01.609342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0613 04:51:01.609342 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0613 04:51:01.609351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0613 04:51:01.609409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0613 04:51:01.609416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0613 04:51:01.609422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0613 04:51:01.609429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0613 04:51:01.610427 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f67f46d3f37b45173b0cc6d883eaa106b2f57c175f325842b7876355b81484b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.762935 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.769292 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d105eb-767c-42ff-9205-0a4cc4a662ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrtd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.803421 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.803495 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtd7\" (UniqueName: \"kubernetes.io/projected/b0d105eb-767c-42ff-9205-0a4cc4a662ae-kube-api-access-jrtd7\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.803517 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0d105eb-767c-42ff-9205-0a4cc4a662ae-serviceca\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.803534 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d105eb-767c-42ff-9205-0a4cc4a662ae-host\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.803637 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:03.803620629 +0000 UTC m=+22.249868102 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903881 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903918 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d105eb-767c-42ff-9205-0a4cc4a662ae-host\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903938 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903957 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903976 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.903993 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrtd7\" (UniqueName: \"kubernetes.io/projected/b0d105eb-767c-42ff-9205-0a4cc4a662ae-kube-api-access-jrtd7\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.904009 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0d105eb-767c-42ff-9205-0a4cc4a662ae-serviceca\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.904389 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d105eb-767c-42ff-9205-0a4cc4a662ae-host\") pod \"node-ca-gt4w4\" 
(UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904413 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904453 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904467 4894 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904501 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904518 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904519 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:03.904500459 +0000 UTC m=+22.350747922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904529 4894 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904573 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:03.904554441 +0000 UTC m=+22.350801904 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904605 4894 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904625 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:03.904619112 +0000 UTC m=+22.350866575 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904425 4894 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: E0613 04:51:02.904647 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:03.904642373 +0000 UTC m=+22.350889836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.904869 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0d105eb-767c-42ff-9205-0a4cc4a662ae-serviceca\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.921745 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtd7\" (UniqueName: \"kubernetes.io/projected/b0d105eb-767c-42ff-9205-0a4cc4a662ae-kube-api-access-jrtd7\") pod \"node-ca-gt4w4\" (UID: \"b0d105eb-767c-42ff-9205-0a4cc4a662ae\") " pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: I0613 04:51:02.961896 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gt4w4" Jun 13 04:51:02 crc kubenswrapper[4894]: W0613 04:51:02.979109 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d105eb_767c_42ff_9205_0a4cc4a662ae.slice/crio-ca46cc974a6ee3b87db78d5d2b4adca15c4a585afaf4d570e3ca2c9fb15f095d WatchSource:0}: Error finding container ca46cc974a6ee3b87db78d5d2b4adca15c4a585afaf4d570e3ca2c9fb15f095d: Status 404 returned error can't find the container with id ca46cc974a6ee3b87db78d5d2b4adca15c4a585afaf4d570e3ca2c9fb15f095d Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.018182 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.032608 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.033959 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.050826 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.064523 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.074284 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.093239 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5d8314e-c0d3-4fb4-a0c3-205a612208ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cbc2bb91aafe0fcb8323a43028a1df0e68d2f6dfa62fd108946985dd6c60ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e138ef39aa341e35528ebff11c14ea8661b5d7a06a4c84dd91f2d243e8317aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b8eaa3ccf19686ecbc6df08fcd83cb60e5bfb2f7a251a4e502323751d21aef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-06-13
T04:51:01Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0613 04:50:55.890925 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0613 04:50:55.892575 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1445056400/tls.crt::/tmp/serving-cert-1445056400/tls.key\\\\\\\"\\\\nI0613 04:51:01.587444 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0613 04:51:01.598034 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0613 04:51:01.598070 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0613 04:51:01.598108 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0613 04:51:01.598121 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0613 04:51:01.609300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0613 04:51:01.609342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0613 04:51:01.609342 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0613 04:51:01.609351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0613 04:51:01.609409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0613 04:51:01.609416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0613 04:51:01.609422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0613 04:51:01.609429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0613 04:51:01.610427 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f67f46d3f37b45173b0cc6d883eaa106b2f57c175f325842b7876355b81484b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.104321 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.138399 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.146308 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.153686 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d105eb-767c-42ff-9205-0a4cc4a662ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrtd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.176443 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.199240 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.241579 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d81ac64-f9d4-4798-9cb1-902a0daad01e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8548e55d96a6e31dab5d796da491dd8495ffea04fbf09b9b09453afb1252110b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67a833e5ae119f0f69551d11b31de6f2daad4c62c1b5f35c2efd0d257c19d99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991c930bf472eced0e8f5fe8f046080d0b1fd0f81aac9f8661fb07bc5b6a81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f1ef4484045b68bf2902e21099f091bc8be12b8096fdc3787ee472cd81e680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f8cfbc52fb95b99f483869905cc139391e35ae07dfeb77b6ef8d76a8ade4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d6e35e8f90b0304b7bc94c80ce204a14da587deb7fb9df8b3dd1de4da99380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6d6e35e8f90b0304b7bc94c80ce204a14da587deb7fb9df8b3dd1de4da99380\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.260617 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.275877 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.275985 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.279404 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.294952 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.306693 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.320094 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5d8314e-c0d3-4fb4-a0c3-205a612208ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3cbc2bb91aafe0fcb8323a43028a1df0e68d2f6dfa62fd108946985dd6c60ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e138ef39aa341e35528ebff11c14ea8661b5d7a06a4c84dd91f2d243e8317aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b8eaa3ccf19686ecbc6df08fcd83cb60e5bfb2f7a251a4e502323751d21aef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cd96ce731d09d040c73ce9f00ba2367965b0228b2a57738fa7acccadfcaa5b6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0613 04:50:55.890925 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0613 04:50:55.892575 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1445056400/tls.crt::/tmp/serving-cert-1445056400/tls.key\\\\\\\"\\\\nI0613 04:51:01.587444 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0613 04:51:01.598034 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0613 04:51:01.598070 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0613 04:51:01.598108 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0613 04:51:01.598121 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0613 04:51:01.609300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0613 04:51:01.609342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0613 04:51:01.609342 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0613 04:51:01.609351 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0613 04:51:01.609409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0613 04:51:01.609416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0613 04:51:01.609422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0613 04:51:01.609429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0613 04:51:01.610427 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f67f46d3f37b45173b0cc6d883eaa106b2f57c175f325842b7876355b81484b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11de77b134815b1b3aa996e33c5550be1d55e07a6545951658f3fa023ead1c9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.332495 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.346618 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.362133 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.387765 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.397423 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"77087ff55c4bbbdde48b586cb98bd4ba608a1c06c61a162ed875b18a09cb7464"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.398956 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.399304 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gt4w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d105eb-767c-42ff-9205-0a4cc4a662ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrtd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gt4w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.400461 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ada4cbf656a499256a73339fd4bf5d990c0a8cb7df88b5ca5971ee0332cbb9e"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.401320 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.402370 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gt4w4" event={"ID":"b0d105eb-767c-42ff-9205-0a4cc4a662ae","Type":"ContainerStarted","Data":"c8c80b1a360a220fe3942e4cbfd36ae1387fd6c52fe9e15e7df17c7a7b3df11c"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.402392 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gt4w4" event={"ID":"b0d105eb-767c-42ff-9205-0a4cc4a662ae","Type":"ContainerStarted","Data":"ca46cc974a6ee3b87db78d5d2b4adca15c4a585afaf4d570e3ca2c9fb15f095d"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.403241 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-t6vz8"] Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.403562 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.403908 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b6bf467426340b11c029f0fe125c3b78ddde0087704f4cca462f73c36579dd32"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.403949 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dea432bae57061c08837906f73acfe76046af5a0ecd3e024ffb455068c37fd44"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.403959 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5fa52f857b1e80138fb1a355bd532d60c8a503e8da4343a09e0d525e797757c8"} Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.416032 4894 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.416063 4894 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.416085 4894 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.416119 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.416077 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.416130 4894 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.416030 4894 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.416167 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.416198 4894 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.416212 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.416361 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.416773 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/192fcf92-25d2-4664-bb9d-8857929dd084-rootfs\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.416792 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzbg\" (UniqueName: \"kubernetes.io/projected/192fcf92-25d2-4664-bb9d-8857929dd084-kube-api-access-6kzbg\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.416848 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/192fcf92-25d2-4664-bb9d-8857929dd084-mcd-auth-proxy-config\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.417049 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"30e0048c6ecfc587a68d9690891324f7b1f97767e71c41576632a14780c1d093"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.417086 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6e6a7032679fb8742c001b72ebe8d3c8e1017ee6398a85cbb17d9a234c705bbb"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.418095 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4668k" event={"ID":"2413124f-bf41-452a-94ff-eda830d6dc91","Type":"ContainerStarted","Data":"956a1700c6adbaff6430de7319db3be7eae741ac09889b15c654ad21545076e7"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.418128 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4668k" event={"ID":"2413124f-bf41-452a-94ff-eda830d6dc91","Type":"ContainerStarted","Data":"fd09f9b236d5b5f488e8df515785c29bc317565a85763331d8006fc4debbb574"} Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.423361 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8ss9"] Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.424037 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.426393 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dg4pl"] Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.427105 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.429198 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.429817 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.429898 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.430279 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xnlj9"] Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.430899 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.433272 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.433556 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.438007 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.438140 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.438101 4894 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.438042 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.438111 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.447776 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.451981 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d81ac64-f9d4-4798-9cb1-902a0daad01e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8548e55d96a6e31dab5d796da491dd8495ffea04fbf09b9b09453afb1252110b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://67a833e5ae119f0f69551d11b31de6f2daad4c62c1b5f35c2efd0d257c19d99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991c930bf472eced0e8f5fe8f046080d0b1fd0f81aac9f8661fb07bc5b6a81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f1ef4484045b68bf2902e21099f091bc8be12b8096fdc3787ee472cd81e680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f8cfbc52fb95b99f483869905cc139391e35ae07dfeb77b6ef8d76a8ade4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6d6e35e8f90b0304b7bc94c80ce204a14da587de
b7fb9df8b3dd1de4da99380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6d6e35e8f90b0304b7bc94c80ce204a14da587deb7fb9df8b3dd1de4da99380\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928e4a3ceea6caa024dbdcaeac8294619d1e3b41a6b8ba4c548fea11d837a927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f35f96003709d5a2ea890a9b66e5150ca02160d7af756396b4a2e5c61341ee7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-06-13T04:50:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-06-13T04:50:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.467293 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.488528 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.507672 4894 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517383 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cnibin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517521 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517591 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517689 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-k8s-cni-cncf-io\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517781 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.517890 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518001 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518112 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsp6\" (UniqueName: \"kubernetes.io/projected/b773a3b3-2a9d-437f-bc37-8f06b26b7714-kube-api-access-6vsp6\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518242 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-netns\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518338 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-hostroot\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518410 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-etc-kubernetes\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518505 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518620 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518723 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518850 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.518962 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519070 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-system-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519176 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-multus-certs\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519284 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519394 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519501 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.519626 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520123 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/192fcf92-25d2-4664-bb9d-8857929dd084-rootfs\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520258 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzbg\" (UniqueName: \"kubernetes.io/projected/192fcf92-25d2-4664-bb9d-8857929dd084-kube-api-access-6kzbg\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520437 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520562 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-os-release\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520682 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-bin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520791 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-multus\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520899 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521020 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521135 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521250 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-system-cni-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521540 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cnibin\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521642 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521758 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-socket-dir-parent\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc 
kubenswrapper[4894]: I0613 04:51:03.521864 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-daemon-config\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.521975 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522079 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhk58\" (UniqueName: \"kubernetes.io/projected/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-kube-api-access-bhk58\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522179 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522300 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522402 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cni-binary-copy\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522504 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-conf-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522605 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/192fcf92-25d2-4664-bb9d-8857929dd084-mcd-auth-proxy-config\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522748 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns\") pod \"ovnkube-node-n8ss9\" 
(UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522860 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.522969 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8m4h\" (UniqueName: \"kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.523068 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-os-release\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.523169 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-kubelet\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.520223 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/192fcf92-25d2-4664-bb9d-8857929dd084-rootfs\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.527015 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.591685 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4668k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2413124f-bf41-452a-94ff-eda830d6dc91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s48g2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:51:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4668k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.623853 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624058 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-os-release\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624165 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-bin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624255 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-multus\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624345 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624440 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624532 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624619 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-system-cni-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624743 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cnibin\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624852 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.624991 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-socket-dir-parent\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.625117 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-daemon-config\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.625258 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.625393 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhk58\" (UniqueName: \"kubernetes.io/projected/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-kube-api-access-bhk58\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.625545 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 
04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626116 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626262 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cni-binary-copy\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626391 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-conf-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626506 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-bin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626614 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-os-release\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626696 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-cni-multus\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626711 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626818 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-socket-dir-parent\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626853 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.626897 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627243 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627382 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-daemon-config\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627445 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627390 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627746 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627801 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627740 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8m4h\" (UniqueName: \"kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627839 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-cnibin\") pod 
\"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627863 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cni-binary-copy\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627863 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-os-release\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627809 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-system-cni-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627751 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627926 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-kubelet\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627927 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-os-release\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627930 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627949 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-conf-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627972 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-var-lib-kubelet\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " 
pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.627977 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-multus-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628032 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628057 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628085 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cnibin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628111 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-cnibin\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628162 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628206 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628235 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628290 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628302 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628344 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628364 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsp6\" (UniqueName: \"kubernetes.io/projected/b773a3b3-2a9d-437f-bc37-8f06b26b7714-kube-api-access-6vsp6\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628381 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-k8s-cni-cncf-io\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628395 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-hostroot\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628413 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-etc-kubernetes\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628429 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-netns\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628439 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-k8s-cni-cncf-io\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628446 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628466 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628490 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-hostroot\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628486 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628510 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628535 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628550 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628564 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-system-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628580 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-multus-certs\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628580 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-netns\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628595 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628608 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628623 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628634 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-etc-kubernetes\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628646 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628686 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628718 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628731 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628746 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-system-cni-dir\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628822 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628853 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-host-run-multus-certs\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.628882 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.629147 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b773a3b3-2a9d-437f-bc37-8f06b26b7714-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.629322 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.632951 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.635404 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.661174 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhk58\" (UniqueName: \"kubernetes.io/projected/b06b223d-8b15-48b3-ab96-5cf1b76fbcbd-kube-api-access-bhk58\") pod \"multus-xnlj9\" (UID: \"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd\") " pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.684723 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsp6\" (UniqueName: \"kubernetes.io/projected/b773a3b3-2a9d-437f-bc37-8f06b26b7714-kube-api-access-6vsp6\") pod \"multus-additional-cni-plugins-dg4pl\" (UID: \"b773a3b3-2a9d-437f-bc37-8f06b26b7714\") " pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.700998 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8m4h\" (UniqueName: \"kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h\") pod \"ovnkube-node-n8ss9\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.731873 4894 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58f6d7fb-8209-45ca-bab4-1411f44356e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-06-13T04:50:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffd3ad4edcad2663aa50beabde1ca93c3877db9318f1a3fb83e604b35bb57e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcee6d8218e422e4bd930243a74480a90fc8a4854314524a72346c0b7e7193a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022af7ab196f3ab07dbac8fe63010fff9f2f2bd0efeb46fe83f5558a8cb8727b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdb6518ae29b6ee1324de4533df3fc9202ba2ccee251f77d6ad3f05c164d0e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-06-13T04:50:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-06-13T04:50:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.756321 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.763176 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.765924 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d44566_8b68_4321_aec1_c8f73ead6c7c.slice/crio-e63133071753a8d8f4423b52ad1092600ebf49c45da1cd198013a8f9d5d1eada WatchSource:0}: Error finding container e63133071753a8d8f4423b52ad1092600ebf49c45da1cd198013a8f9d5d1eada: Status 404 returned error can't find the container with id e63133071753a8d8f4423b52ad1092600ebf49c45da1cd198013a8f9d5d1eada Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.768834 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xnlj9" Jun 13 04:51:03 crc kubenswrapper[4894]: W0613 04:51:03.785094 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06b223d_8b15_48b3_ab96_5cf1b76fbcbd.slice/crio-5f8887ef4d4dfd5fc3599e413b54e7b910577c7108e864a48f34bb26af674f9a WatchSource:0}: Error finding container 5f8887ef4d4dfd5fc3599e413b54e7b910577c7108e864a48f34bb26af674f9a: Status 404 returned error can't find the container with id 5f8887ef4d4dfd5fc3599e413b54e7b910577c7108e864a48f34bb26af674f9a Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.823404 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=1.823365806 podStartE2EDuration="1.823365806s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:03.792630329 +0000 UTC m=+22.238877792" watchObservedRunningTime="2025-06-13 04:51:03.823365806 +0000 UTC m=+22.269613299" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.833255 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.833576 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.83355189 +0000 UTC m=+24.279799393 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.934816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.934866 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.934918 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:03 crc kubenswrapper[4894]: I0613 04:51:03.934942 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935076 4894 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935145 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.93512506 +0000 UTC m=+24.381372543 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935515 4894 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935563 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.935552612 +0000 UTC m=+24.381800085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935627 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935693 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935708 4894 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935743 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.935734317 +0000 UTC m=+24.381981790 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935804 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935815 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935825 4894 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:03 crc kubenswrapper[4894]: E0613 04:51:03.935853 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.93584487 +0000 UTC m=+24.382092353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.071914 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.071895395 podStartE2EDuration="1.071895395s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:04.070785963 +0000 UTC m=+22.517033426" watchObservedRunningTime="2025-06-13 04:51:04.071895395 +0000 UTC m=+22.518142848" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.128285 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b"] Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.128719 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.146683 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.151914 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4dj8k"] Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.152265 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.152314 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.167076 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237554 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjsx4\" (UniqueName: \"kubernetes.io/projected/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-kube-api-access-wjsx4\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237594 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237639 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237690 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237709 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.237733 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9slp4\" (UniqueName: \"kubernetes.io/projected/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-kube-api-access-9slp4\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.275833 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.276113 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.276479 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.276616 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.279526 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.280182 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.281276 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.281911 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.282876 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.283378 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.283950 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.284883 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.285447 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jun 13 04:51:04 crc 
kubenswrapper[4894]: I0613 04:51:04.286329 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.286853 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.287856 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.288338 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.288845 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.289683 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.290160 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.291053 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.291466 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.292018 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.293026 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.293456 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.295279 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.295704 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.296800 4894 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.297223 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.298186 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.298861 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.299671 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.300237 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.300728 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.301520 4894 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.301618 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.303183 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.304028 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.304473 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.305916 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.306950 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.307466 4894 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.308429 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.309136 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.309921 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.310492 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.311539 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.312129 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.312945 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.313444 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.314331 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.315050 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.315864 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.316324 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.317123 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.317594 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.318132 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.318937 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.333728 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4668k" podStartSLOduration=3.3337104379999998 podStartE2EDuration="3.333710438s" podCreationTimestamp="2025-06-13 04:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:04.3330839 +0000 UTC m=+22.779331363" watchObservedRunningTime="2025-06-13 04:51:04.333710438 +0000 UTC m=+22.779957901" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338423 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338479 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338508 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338525 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9slp4\" (UniqueName: \"kubernetes.io/projected/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-kube-api-access-9slp4\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338570 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjsx4\" (UniqueName: \"kubernetes.io/projected/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-kube-api-access-wjsx4\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.338588 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.338904 4894 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.339007 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs podName:c225ccc7-9659-4c3e-9256-af46e1dd1cd6 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:04.83897876 +0000 UTC m=+23.285226223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs") pod "network-metrics-daemon-4dj8k" (UID: "c225ccc7-9659-4c3e-9256-af46e1dd1cd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.339496 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.340382 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.344173 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.349929 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.400917 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9slp4\" (UniqueName: \"kubernetes.io/projected/df5bd0a1-01e3-4086-ae11-57b2f17ef1dc-kube-api-access-9slp4\") pod \"ovnkube-control-plane-749d76644c-wg74b\" (UID: \"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.426694 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnlj9" event={"ID":"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd","Type":"ContainerStarted","Data":"566e4df392011bdd259e91c4add736fd3ea48b8ea29ced0871cd3f8bd459d669"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.426748 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnlj9" 
event={"ID":"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd","Type":"ContainerStarted","Data":"5f8887ef4d4dfd5fc3599e413b54e7b910577c7108e864a48f34bb26af674f9a"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.428122 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="ff2f25e4d6c20e51c06e70d94b8835fe51f5658a464c2c2c2edc099fd54bf618" exitCode=0 Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.428202 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"ff2f25e4d6c20e51c06e70d94b8835fe51f5658a464c2c2c2edc099fd54bf618"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.428231 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerStarted","Data":"e04076a62c25bf73450ec8b2837bc7d462cff4a10c396f2a08ed52a96173726a"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.431384 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" exitCode=0 Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.431494 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.431550 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"e63133071753a8d8f4423b52ad1092600ebf49c45da1cd198013a8f9d5d1eada"} Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.432846 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjsx4\" (UniqueName: \"kubernetes.io/projected/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-kube-api-access-wjsx4\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.440220 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.479105 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.4790841 podStartE2EDuration="3.4790841s" podCreationTimestamp="2025-06-13 04:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:04.479046989 +0000 UTC m=+22.925294452" watchObservedRunningTime="2025-06-13 04:51:04.4790841 +0000 UTC m=+22.925331573" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.488279 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.493679 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/192fcf92-25d2-4664-bb9d-8857929dd084-mcd-auth-proxy-config\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.518570 4894 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.518686 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls podName:192fcf92-25d2-4664-bb9d-8857929dd084 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.018644612 +0000 UTC m=+23.464892085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls") pod "machine-config-daemon-t6vz8" (UID: "192fcf92-25d2-4664-bb9d-8857929dd084") : failed to sync secret cache: timed out waiting for the condition Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.635271 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gt4w4" podStartSLOduration=2.635248995 podStartE2EDuration="2.635248995s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:04.604081186 +0000 UTC m=+23.050328649" watchObservedRunningTime="2025-06-13 04:51:04.635248995 +0000 UTC m=+23.081496458" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.742710 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xnlj9" podStartSLOduration=2.742693535 podStartE2EDuration="2.742693535s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:04.742215141 +0000 UTC m=+23.188462604" watchObservedRunningTime="2025-06-13 04:51:04.742693535 +0000 UTC m=+23.188940998" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.796758 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.803015 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzbg\" (UniqueName: \"kubernetes.io/projected/192fcf92-25d2-4664-bb9d-8857929dd084-kube-api-access-6kzbg\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.844100 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.844316 4894 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:04 crc kubenswrapper[4894]: E0613 04:51:04.844439 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs podName:c225ccc7-9659-4c3e-9256-af46e1dd1cd6 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:05.844422279 +0000 UTC m=+24.290669742 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs") pod "network-metrics-daemon-4dj8k" (UID: "c225ccc7-9659-4c3e-9256-af46e1dd1cd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:04 crc kubenswrapper[4894]: I0613 04:51:04.887184 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.002977 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.046121 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.049520 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/192fcf92-25d2-4664-bb9d-8857929dd084-proxy-tls\") pod \"machine-config-daemon-t6vz8\" (UID: \"192fcf92-25d2-4664-bb9d-8857929dd084\") " pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.247785 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.275561 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.275730 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.436690 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.436970 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.436980 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.436988 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.436997 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.437008 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.437923 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" event={"ID":"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc","Type":"ContainerStarted","Data":"23ba45b09fb93ac0d840d57a44e3699b718b6b951bd4f90f6d1693786ccd2761"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.437967 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" event={"ID":"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc","Type":"ContainerStarted","Data":"8d2e11ea509531dec31eee1584b7d0ece9a49f3c01220789ff01a4c21e6046d7"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.437981 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" event={"ID":"df5bd0a1-01e3-4086-ae11-57b2f17ef1dc","Type":"ContainerStarted","Data":"510626aed3b45680517c7f4dec2eec1bb164e481ac5b57d623d3cfca09afe97c"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.439074 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="ecc9929908b972cb72bc6c0ec167c233c9d192951f59d1de365132c2606a40e1" exitCode=0 Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.439126 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" 
event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"ecc9929908b972cb72bc6c0ec167c233c9d192951f59d1de365132c2606a40e1"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.440460 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.440493 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"691a14c5c139d68a0acda9ce552d1a8e564d0d2fe71c770990de4bb86d5a6724"} Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.473541 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wg74b" podStartSLOduration=2.473521367 podStartE2EDuration="2.473521367s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:05.451227734 +0000 UTC m=+23.897475197" watchObservedRunningTime="2025-06-13 04:51:05.473521367 +0000 UTC m=+23.919768830" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.853952 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.854146 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.854181 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:09.854131326 +0000 UTC m=+28.300378829 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.854297 4894 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.854373 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs podName:c225ccc7-9659-4c3e-9256-af46e1dd1cd6 nodeName:}" failed. 
No retries permitted until 2025-06-13 04:51:07.854348383 +0000 UTC m=+26.300595876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs") pod "network-metrics-daemon-4dj8k" (UID: "c225ccc7-9659-4c3e-9256-af46e1dd1cd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.954976 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.955013 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.955049 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:05 crc kubenswrapper[4894]: I0613 04:51:05.955071 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955160 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955188 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955197 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955204 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955212 4894 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955216 4894 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955264 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:09.955247123 +0000 UTC m=+28.401494586 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955279 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:09.955273154 +0000 UTC m=+28.401520607 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955290 4894 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955322 4894 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955337 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:09.955315875 +0000 UTC m=+28.401563348 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:05 crc kubenswrapper[4894]: E0613 04:51:05.955356 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:09.955347056 +0000 UTC m=+28.401594529 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.276361 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.276368 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.276441 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:06 crc kubenswrapper[4894]: E0613 04:51:06.276595 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:06 crc kubenswrapper[4894]: E0613 04:51:06.276764 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:06 crc kubenswrapper[4894]: E0613 04:51:06.277305 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.445552 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"8cba60102d701dd41ad177c026330c56b0166ea912f65e398975e1c1817c60a8"} Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.449806 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="6ea1bf224b85b569822e9388a427e7dba490da88e0deea1d8dd037edb027d422" exitCode=0 Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.449867 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"6ea1bf224b85b569822e9388a427e7dba490da88e0deea1d8dd037edb027d422"} Jun 13 04:51:06 crc kubenswrapper[4894]: I0613 04:51:06.503776 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podStartSLOduration=4.503750226 podStartE2EDuration="4.503750226s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:06.460984052 +0000 UTC m=+24.907231555" watchObservedRunningTime="2025-06-13 04:51:06.503750226 +0000 UTC m=+24.949997699" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.275525 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:07 crc kubenswrapper[4894]: E0613 04:51:07.276009 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.459796 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.461713 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f4547c12d7a9d14342453d5bf41b9122a3111539037690ce5f8ad582b095489d"} Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.465601 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="4e9e3ac22fe6c2e9ccb970606ab0510a83d4c0773314cc8a79bf6eac8631476a" exitCode=0 Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.465726 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"4e9e3ac22fe6c2e9ccb970606ab0510a83d4c0773314cc8a79bf6eac8631476a"} Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.877934 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:07 crc kubenswrapper[4894]: E0613 04:51:07.878210 4894 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:07 crc kubenswrapper[4894]: E0613 04:51:07.878320 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs podName:c225ccc7-9659-4c3e-9256-af46e1dd1cd6 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:11.878290978 +0000 UTC m=+30.324538481 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs") pod "network-metrics-daemon-4dj8k" (UID: "c225ccc7-9659-4c3e-9256-af46e1dd1cd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.979992 4894 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.985043 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.985092 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.985112 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.985291 4894 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.993312 4894 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.993613 4894 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.996099 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.996134 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.996161 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.996177 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jun 13 04:51:07 crc kubenswrapper[4894]: I0613 04:51:07.996186 4894 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-06-13T04:51:07Z","lastTransitionTime":"2025-06-13T04:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.067014 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5"] Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.067630 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.070863 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.070873 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.070957 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.074006 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.079349 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.079399 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.079444 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.079547 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.079577 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180285 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 
04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180330 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180384 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180423 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180465 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180494 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.180890 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.182114 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.197814 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.203795 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/15164a2c-9512-4e64-b9f1-52b25e0e4d0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzqd5\" (UID: \"15164a2c-9512-4e64-b9f1-52b25e0e4d0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.280385 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.280423 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:08 crc kubenswrapper[4894]: E0613 04:51:08.280539 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.281048 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:08 crc kubenswrapper[4894]: E0613 04:51:08.281119 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:08 crc kubenswrapper[4894]: E0613 04:51:08.281208 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.390084 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" Jun 13 04:51:08 crc kubenswrapper[4894]: W0613 04:51:08.447484 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15164a2c_9512_4e64_b9f1_52b25e0e4d0a.slice/crio-9c959a159d00f9d7a16df29e34c4a7c618669bed3b30310f7537062b24422fd3 WatchSource:0}: Error finding container 9c959a159d00f9d7a16df29e34c4a7c618669bed3b30310f7537062b24422fd3: Status 404 returned error can't find the container with id 9c959a159d00f9d7a16df29e34c4a7c618669bed3b30310f7537062b24422fd3 Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.470005 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" event={"ID":"15164a2c-9512-4e64-b9f1-52b25e0e4d0a","Type":"ContainerStarted","Data":"9c959a159d00f9d7a16df29e34c4a7c618669bed3b30310f7537062b24422fd3"} Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.475861 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="717d7f6c76bf365079581c64bb8534331e1e747f999f7da27ab5cc8bc085098f" exitCode=0 Jun 13 04:51:08 crc kubenswrapper[4894]: I0613 04:51:08.476128 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"717d7f6c76bf365079581c64bb8534331e1e747f999f7da27ab5cc8bc085098f"} Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.276013 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:09 crc kubenswrapper[4894]: E0613 04:51:09.276613 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.486696 4894 generic.go:334] "Generic (PLEG): container finished" podID="b773a3b3-2a9d-437f-bc37-8f06b26b7714" containerID="1656ca3b7e3415c262b5f1162418bbcdf3be7165f7dbd6ae5da7b0e86089415a" exitCode=0 Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.487001 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerDied","Data":"1656ca3b7e3415c262b5f1162418bbcdf3be7165f7dbd6ae5da7b0e86089415a"} Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.497447 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" event={"ID":"15164a2c-9512-4e64-b9f1-52b25e0e4d0a","Type":"ContainerStarted","Data":"23aed170031687c063dc65963a75ba7ebeace8618d57f28b2399fa96514d45f4"} Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.550945 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzqd5" podStartSLOduration=7.550925937 podStartE2EDuration="7.550925937s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:09.549599789 +0000 UTC m=+27.995847252" watchObservedRunningTime="2025-06-13 04:51:09.550925937 +0000 UTC m=+27.997173410" Jun 13 04:51:09 crc kubenswrapper[4894]: I0613 04:51:09.903563 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:09 crc kubenswrapper[4894]: E0613 04:51:09.903794 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.903774056 +0000 UTC m=+36.350021529 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.004121 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.004193 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.004247 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.004285 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004403 4894 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004490 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004505 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004561 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004586 4894 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:10 crc 
kubenswrapper[4894]: E0613 04:51:10.004593 4894 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004518 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.004486011 +0000 UTC m=+36.450733504 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004518 4894 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004714 4894 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004727 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.004693957 +0000 UTC m=+36.450941460 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004755 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.004740609 +0000 UTC m=+36.450988102 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.004778 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.004767149 +0000 UTC m=+36.451014643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.276332 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.276421 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.276761 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.276465 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.276841 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:10 crc kubenswrapper[4894]: E0613 04:51:10.277048 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.506093 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" event={"ID":"b773a3b3-2a9d-437f-bc37-8f06b26b7714","Type":"ContainerStarted","Data":"723acc35ebba9b7677496a75594dec00057e84e8fa39296cdaf4421880bc9bf7"} Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.511581 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerStarted","Data":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.540745 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dg4pl" podStartSLOduration=8.5407188 podStartE2EDuration="8.5407188s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:10.540110203 +0000 UTC m=+28.986357696" watchObservedRunningTime="2025-06-13 04:51:10.5407188 +0000 UTC m=+28.986966303" Jun 13 04:51:10 crc kubenswrapper[4894]: I0613 04:51:10.587445 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podStartSLOduration=8.587425148 podStartE2EDuration="8.587425148s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:10.585609145 +0000 UTC m=+29.031856638" watchObservedRunningTime="2025-06-13 04:51:10.587425148 +0000 UTC m=+29.033672621" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.276390 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:11 crc kubenswrapper[4894]: E0613 04:51:11.276578 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.514624 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.515668 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.515743 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.545037 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.547832 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.777507 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4dj8k"] Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.777625 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:11 crc kubenswrapper[4894]: E0613 04:51:11.777727 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:11 crc kubenswrapper[4894]: I0613 04:51:11.927701 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:11 crc kubenswrapper[4894]: E0613 04:51:11.927839 4894 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:11 crc kubenswrapper[4894]: E0613 04:51:11.927903 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs podName:c225ccc7-9659-4c3e-9256-af46e1dd1cd6 nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.927883425 +0000 UTC m=+38.374130898 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs") pod "network-metrics-daemon-4dj8k" (UID: "c225ccc7-9659-4c3e-9256-af46e1dd1cd6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jun 13 04:51:12 crc kubenswrapper[4894]: I0613 04:51:12.275549 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:12 crc kubenswrapper[4894]: I0613 04:51:12.275649 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:12 crc kubenswrapper[4894]: E0613 04:51:12.277279 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jun 13 04:51:12 crc kubenswrapper[4894]: E0613 04:51:12.277486 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jun 13 04:51:12 crc kubenswrapper[4894]: I0613 04:51:12.517368 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.276724 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.276790 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:13 crc kubenswrapper[4894]: E0613 04:51:13.276950 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4dj8k" podUID="c225ccc7-9659-4c3e-9256-af46e1dd1cd6" Jun 13 04:51:13 crc kubenswrapper[4894]: E0613 04:51:13.277146 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.520947 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.925511 4894 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.925725 4894 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.981477 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.982100 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.983247 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc"] Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.983796 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.988215 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.989184 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.989407 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp"] Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.990130 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.990773 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn"] Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.990984 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.991081 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.991094 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.991157 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jun 13 04:51:13 crc kubenswrapper[4894]: I0613 04:51:13.991369 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.005070 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.006489 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.007922 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-chjqk"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.008356 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.009746 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.012250 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.012295 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.013306 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.017707 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.018035 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.018166 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.018562 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.026037 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.026338 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.028073 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.028459 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.028777 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzgh5"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.029458 4894 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-qkkzl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.029989 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.031096 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.031894 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.032635 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.034145 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.039794 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.040285 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fljtp"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.040854 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.040896 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.041762 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.044414 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.045007 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.046689 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.046854 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.046949 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.050211 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9qc97"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.050938 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.051016 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.051224 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.051366 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.051489 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.051791 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052008 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052496 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052619 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052675 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052707 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdx9\" (UniqueName: \"kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052763 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgzc2\" (UniqueName: \"kubernetes.io/projected/063e6164-98d1-4862-8aef-a3544115769f-kube-api-access-hgzc2\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052791 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/063e6164-98d1-4862-8aef-a3544115769f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052814 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/063e6164-98d1-4862-8aef-a3544115769f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052836 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052856 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052880 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-node-pullsecrets\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052902 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qts\" (UniqueName: \"kubernetes.io/projected/ee7ab003-adaa-4207-8064-34ad105f5064-kube-api-access-t8qts\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052924 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052983 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-config\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053007 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-images\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053036 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-client\") pod \"apiserver-76f77b778f-tzgh5\" 
(UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053063 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/79df336b-3caf-4071-9832-3fddb99896a1-machine-approver-tls\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053084 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-serving-cert\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053104 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-audit-dir\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053128 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-auth-proxy-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053149 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-image-import-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053171 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053190 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-audit\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053228 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqsn\" (UniqueName: \"kubernetes.io/projected/e2b21a60-5e2f-4892-93b1-d3de72586e25-kube-api-access-wxqsn\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 
04:51:14.053272 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053294 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053352 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053373 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtd4f\" (UniqueName: \"kubernetes.io/projected/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-kube-api-access-rtd4f\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053395 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsc4r\" (UniqueName: \"kubernetes.io/projected/79df336b-3caf-4071-9832-3fddb99896a1-kube-api-access-qsc4r\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053430 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-encryption-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053453 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b21a60-5e2f-4892-93b1-d3de72586e25-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052771 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052898 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052930 4894 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.052974 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053025 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053075 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053288 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053329 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.055878 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.056617 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053356 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053378 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.053421 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.058900 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.059873 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.061308 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.061713 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.061762 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062182 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062374 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062516 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062731 4894 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062790 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062878 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.062967 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.063110 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.063966 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.065606 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.065895 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.066173 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.073724 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.073873 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.076077 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.076995 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.077249 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.077429 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.078421 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.078690 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.078886 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079124 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079134 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079244 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079493 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079695 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079795 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.079926 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.092831 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.095193 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.095533 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.095787 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.119226 4894 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.121178 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.122057 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.132594 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.133860 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.132943 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.137759 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.138622 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.148702 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.149213 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154149 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/063e6164-98d1-4862-8aef-a3544115769f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154220 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-policies\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154250 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljl2\" (UniqueName: \"kubernetes.io/projected/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-kube-api-access-6ljl2\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154327 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-webhook-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154343 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154394 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/063e6164-98d1-4862-8aef-a3544115769f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154412 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154428 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc 
kubenswrapper[4894]: I0613 04:51:14.155316 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155347 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-kube-api-access-5g97k\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155388 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-node-pullsecrets\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155407 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qts\" (UniqueName: \"kubernetes.io/projected/ee7ab003-adaa-4207-8064-34ad105f5064-kube-api-access-t8qts\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155428 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155447 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155466 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155495 
4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-proxy-tls\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155513 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155531 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.154874 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/063e6164-98d1-4862-8aef-a3544115769f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155547 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-config\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155566 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-images\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155583 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-client\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155598 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-apiservice-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155613 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155637 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/79df336b-3caf-4071-9832-3fddb99896a1-machine-approver-tls\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155638 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-node-pullsecrets\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155666 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-serving-cert\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155684 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-audit-dir\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155708 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdfrd\" (UniqueName: \"kubernetes.io/projected/6506d749-2f93-4065-b77a-8fd46a7494f5-kube-api-access-wdfrd\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155724 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155742 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155757 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155773 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh4pv\" (UniqueName: \"kubernetes.io/projected/08a1918e-6ac6-454c-ab92-26872b549c0e-kube-api-access-mh4pv\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155789 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-auth-proxy-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155806 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-image-import-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155824 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155842 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155860 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-audit\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155877 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqsn\" (UniqueName: \"kubernetes.io/projected/e2b21a60-5e2f-4892-93b1-d3de72586e25-kube-api-access-wxqsn\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155894 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtsj2\" (UniqueName: \"kubernetes.io/projected/afe85b47-c14f-4f2b-8541-2f2230500b42-kube-api-access-xtsj2\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155911 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155935 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.155984 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-client\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156003 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-trusted-ca\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156043 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-serving-cert\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156060 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-config\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156074 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156091 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-images\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 
13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156106 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-serving-cert\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156122 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156154 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-encryption-config\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156169 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-dir\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156186 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156201 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28bcf\" (UniqueName: \"kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156231 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156249 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156264 4894 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156280 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afe85b47-c14f-4f2b-8541-2f2230500b42-tmpfs\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156295 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6506d749-2f93-4065-b77a-8fd46a7494f5-serving-cert\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.156533 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-config\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.157227 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-images\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.157342 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-auth-proxy-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.157641 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee7ab003-adaa-4207-8064-34ad105f5064-audit-dir\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.158637 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-image-import-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.158763 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " 
pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.158793 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08a1918e-6ac6-454c-ab92-26872b549c0e-serving-cert\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.158812 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159233 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159300 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2w5w\" (UniqueName: \"kubernetes.io/projected/98a99c86-d227-48f6-be6f-b0cddd0221ed-kube-api-access-z2w5w\") pod \"downloads-7954f5f757-fljtp\" (UID: \"98a99c86-d227-48f6-be6f-b0cddd0221ed\") " pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159404 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kz8k\" (UniqueName: \"kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159453 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159488 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159530 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtd4f\" (UniqueName: \"kubernetes.io/projected/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-kube-api-access-rtd4f\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159550 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsc4r\" (UniqueName: \"kubernetes.io/projected/79df336b-3caf-4071-9832-3fddb99896a1-kube-api-access-qsc4r\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159578 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.159635 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-audit\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160135 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-config\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160168 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160195 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-encryption-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160232 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b21a60-5e2f-4892-93b1-d3de72586e25-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160255 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 
04:51:14.160273 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czw2k\" (UniqueName: \"kubernetes.io/projected/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-kube-api-access-czw2k\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160288 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160344 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdx9\" (UniqueName: \"kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.160363 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgzc2\" (UniqueName: \"kubernetes.io/projected/063e6164-98d1-4862-8aef-a3544115769f-kube-api-access-hgzc2\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.162325 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.162838 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.163314 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.168043 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79df336b-3caf-4071-9832-3fddb99896a1-config\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.168960 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee7ab003-adaa-4207-8064-34ad105f5064-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.174332 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.190395 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.190833 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.192470 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5tgfk"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.192945 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.193252 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.193527 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.193825 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2dxfq"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.194109 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.195326 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.195613 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.195917 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.196055 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.200755 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7tj44"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.201205 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.201528 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.201726 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.204320 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.205715 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/79df336b-3caf-4071-9832-3fddb99896a1-machine-approver-tls\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.207894 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.209120 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.209230 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.232687 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-serving-cert\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.233038 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.233202 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.233332 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.234934 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.235061 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.236265 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.236568 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/063e6164-98d1-4862-8aef-a3544115769f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.237931 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.240576 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.240953 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-etcd-client\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.241149 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/ee7ab003-adaa-4207-8064-34ad105f5064-encryption-config\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.241502 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.241749 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.247348 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.247734 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.248105 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.249848 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.250423 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdx9\" (UniqueName: \"kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9\") pod \"route-controller-manager-6576b87f9c-26ls4\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.251627 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.254847 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.255540 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b21a60-5e2f-4892-93b1-d3de72586e25-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.258610 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgzc2\" (UniqueName: \"kubernetes.io/projected/063e6164-98d1-4862-8aef-a3544115769f-kube-api-access-hgzc2\") pod \"openshift-apiserver-operator-796bbdcf4f-qjhrn\" (UID: \"063e6164-98d1-4862-8aef-a3544115769f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273332 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-proxy-tls\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273374 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273435 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzngn\" (UniqueName: \"kubernetes.io/projected/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-kube-api-access-kzngn\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273457 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273475 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-client\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273499 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-apiservice-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273517 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273535 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7407ab6-0811-4fc5-9969-e866b4387f88-serving-cert\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273550 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-service-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273570 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wdfrd\" (UniqueName: \"kubernetes.io/projected/6506d749-2f93-4065-b77a-8fd46a7494f5-kube-api-access-wdfrd\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273588 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273607 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh4pv\" (UniqueName: \"kubernetes.io/projected/08a1918e-6ac6-454c-ab92-26872b549c0e-kube-api-access-mh4pv\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273623 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273639 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273675 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273699 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtsj2\" (UniqueName: \"kubernetes.io/projected/afe85b47-c14f-4f2b-8541-2f2230500b42-kube-api-access-xtsj2\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273716 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273734 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273755 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/c4bd4fd2-c14d-42f7-819c-84bb722484d0-kube-api-access-2qdw9\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273788 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-client\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273806 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-trusted-ca\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273823 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4bd4fd2-c14d-42f7-819c-84bb722484d0-service-ca-bundle\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273842 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-images\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273860 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-serving-cert\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273876 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-config\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273893 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273912 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-serving-cert\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273931 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273952 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-default-certificate\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273978 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-encryption-config\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.273994 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-dir\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274013 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274030 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28bcf\" (UniqueName: \"kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274048 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274065 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f56ee03-040d-489c-84ff-dfac0df10942-metrics-tls\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274084 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274101 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afe85b47-c14f-4f2b-8541-2f2230500b42-tmpfs\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274117 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6506d749-2f93-4065-b77a-8fd46a7494f5-serving-cert\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274134 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08a1918e-6ac6-454c-ab92-26872b549c0e-serving-cert\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274150 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274167 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-config\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274184 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2w5w\" (UniqueName: \"kubernetes.io/projected/98a99c86-d227-48f6-be6f-b0cddd0221ed-kube-api-access-z2w5w\") pod \"downloads-7954f5f757-fljtp\" (UID: \"98a99c86-d227-48f6-be6f-b0cddd0221ed\") " pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274201 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-serving-cert\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kz8k\" (UniqueName: \"kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274242 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274271 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-config\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274286 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274301 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czw2k\" (UniqueName: \"kubernetes.io/projected/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-kube-api-access-czw2k\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274317 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274333 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-metrics-certs\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274356 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-policies\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274375 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6ljl2\" (UniqueName: \"kubernetes.io/projected/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-kube-api-access-6ljl2\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274391 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-stats-auth\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274409 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mff6x\" (UniqueName: \"kubernetes.io/projected/f7407ab6-0811-4fc5-9969-e866b4387f88-kube-api-access-mff6x\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274427 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-webhook-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274446 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274464 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274480 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-kube-api-access-5g97k\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274518 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274533 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7407ab6-0811-4fc5-9969-e866b4387f88-config\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274550 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkm5\" (UniqueName: \"kubernetes.io/projected/2f56ee03-040d-489c-84ff-dfac0df10942-kube-api-access-2jkm5\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274565 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274593 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274610 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274639 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.275480 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.276814 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.277089 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.277110 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.277354 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-policies\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.277603 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.277773 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278254 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.274561 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278427 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278471 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsc4r\" (UniqueName: \"kubernetes.io/projected/79df336b-3caf-4071-9832-3fddb99896a1-kube-api-access-qsc4r\") pod \"machine-approver-56656f9798-jz9cp\" (UID: \"79df336b-3caf-4071-9832-3fddb99896a1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278572 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-proxy-tls\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278628 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.278922 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.279077 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.279283 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-trusted-ca-bundle\") 
pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.279448 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/afe85b47-c14f-4f2b-8541-2f2230500b42-tmpfs\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.280670 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.281228 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.281580 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-config\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.281689 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.289095 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.289525 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-webhook-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.291064 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.292840 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-audit-dir\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.293297 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-images\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.293951 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08a1918e-6ac6-454c-ab92-26872b549c0e-trusted-ca\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.294237 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-etcd-client\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.294809 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.295172 4894 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/afe85b47-c14f-4f2b-8541-2f2230500b42-apiservice-cert\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.295464 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6506d749-2f93-4065-b77a-8fd46a7494f5-serving-cert\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.296187 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.296644 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.296947 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.297328 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bmj6"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.297676 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.297915 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298069 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298221 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298359 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298447 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298520 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.298955 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.299000 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.299291 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-serving-cert\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.299670 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqsn\" (UniqueName: \"kubernetes.io/projected/e2b21a60-5e2f-4892-93b1-d3de72586e25-kube-api-access-wxqsn\") pod \"cluster-samples-operator-665b6dd947-j4qlc\" (UID: \"e2b21a60-5e2f-4892-93b1-d3de72586e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.299775 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qts\" (UniqueName: \"kubernetes.io/projected/ee7ab003-adaa-4207-8064-34ad105f5064-kube-api-access-t8qts\") pod \"apiserver-76f77b778f-tzgh5\" (UID: \"ee7ab003-adaa-4207-8064-34ad105f5064\") " pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.300105 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6506d749-2f93-4065-b77a-8fd46a7494f5-config\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.300551 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtd4f\" (UniqueName: \"kubernetes.io/projected/b6c8af75-ffb5-4e91-9d8c-751ba03f67ba-kube-api-access-rtd4f\") pod \"machine-api-operator-5694c8668f-chjqk\" (UID: \"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.300641 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.301040 4894 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.305502 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.305996 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.306858 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.309247 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-chjqk"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.309364 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.309551 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.309807 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.310442 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.311954 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.313714 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.314912 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzgh5"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.316621 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.317116 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.320418 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.320815 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.323188 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qkkzl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.323210 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.323648 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.325789 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.327484 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.328105 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.328292 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.328364 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-serving-cert\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.328885 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-encryption-config\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.329075 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08a1918e-6ac6-454c-ab92-26872b549c0e-serving-cert\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.329771 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.331161 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fljtp"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.334138 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.334944 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.335216 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.336622 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.336963 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.344352 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.344920 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.345010 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwzgq"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.345635 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.345707 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.361853 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.362326 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.362779 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.365977 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.370749 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.372969 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.374117 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t5g77"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.375066 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ns4bb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.376513 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378291 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-default-certificate\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378400 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378499 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f56ee03-040d-489c-84ff-dfac0df10942-metrics-tls\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378601 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-config\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378703 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-serving-cert\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378829 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-metrics-certs\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.378931 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-stats-auth\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379116 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mff6x\" (UniqueName: \"kubernetes.io/projected/f7407ab6-0811-4fc5-9969-e866b4387f88-kube-api-access-mff6x\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: 
\"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379287 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7407ab6-0811-4fc5-9969-e866b4387f88-config\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379558 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jkm5\" (UniqueName: \"kubernetes.io/projected/2f56ee03-040d-489c-84ff-dfac0df10942-kube-api-access-2jkm5\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379698 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379800 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzngn\" (UniqueName: \"kubernetes.io/projected/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-kube-api-access-kzngn\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379875 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-client\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.379984 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7407ab6-0811-4fc5-9969-e866b4387f88-serving-cert\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.380115 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-service-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.380233 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/c4bd4fd2-c14d-42f7-819c-84bb722484d0-kube-api-access-2qdw9\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.380336 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4bd4fd2-c14d-42f7-819c-84bb722484d0-service-ca-bundle\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.380998 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.390602 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7tj44"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.392701 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4bd4fd2-c14d-42f7-819c-84bb722484d0-service-ca-bundle\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.397301 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-stats-auth\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.397426 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-default-certificate\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.403373 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.413627 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-klddm"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.413863 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.414230 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.420286 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cmqdb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.422340 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2dxfq"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.422384 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-z8rxn"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.423078 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.423517 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.424955 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.426941 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.427683 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.428089 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-klddm" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.431700 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7407ab6-0811-4fc5-9969-e866b4387f88-serving-cert\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.432192 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7407ab6-0811-4fc5-9969-e866b4387f88-config\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.433907 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.434316 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bmj6"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.435293 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.435925 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4bd4fd2-c14d-42f7-819c-84bb722484d0-metrics-certs\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.436528 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.438305 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.439635 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.442066 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.444498 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.445638 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.446863 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9qc97"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.448694 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.454766 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.454794 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.454805 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.454838 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.454847 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-klddm"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.455159 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ns4bb"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.456833 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.458044 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwzgq"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.459872 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t5g77"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.461553 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.463554 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.465642 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.484873 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.494928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-serving-cert\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc 
kubenswrapper[4894]: I0613 04:51:14.505493 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.519392 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-client\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.525927 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" event={"ID":"79df336b-3caf-4071-9832-3fddb99896a1","Type":"ContainerStarted","Data":"30ee1fba5e2a4a13feb0bfa65b1e9e89374cbaae1bf7597ff1909b944f437648"} Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.536122 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.541973 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-config\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.544871 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.549302 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.565445 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.576384 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-etcd-service-ca\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.587750 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.599050 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.623903 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.644213 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.663844 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc"] Jun 13 
04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.664203 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.670822 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.690556 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.699058 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-chjqk"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.704162 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jun 13 04:51:14 crc kubenswrapper[4894]: W0613 04:51:14.719083 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063e6164_98d1_4862_8aef_a3544115769f.slice/crio-2d62ae977b09e95c6e102001fbf4ee68f454e89b8950ba8fd0da15094a8bf36c WatchSource:0}: Error finding container 2d62ae977b09e95c6e102001fbf4ee68f454e89b8950ba8fd0da15094a8bf36c: Status 404 returned error can't find the container with id 2d62ae977b09e95c6e102001fbf4ee68f454e89b8950ba8fd0da15094a8bf36c Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.726353 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.728125 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzgh5"] Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.735326 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f56ee03-040d-489c-84ff-dfac0df10942-metrics-tls\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.746840 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.764558 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.786479 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.804517 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.824565 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.850917 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.864793 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.884501 4894 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.924148 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jun 13 04:51:14 crc kubenswrapper[4894]: I0613 04:51:14.968302 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czw2k\" (UniqueName: \"kubernetes.io/projected/d3f8ed2c-0b5d-4d54-86e6-b87e303c166c-kube-api-access-czw2k\") pod \"openshift-config-operator-7777fb866f-hqzfn\" (UID: \"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.004822 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.007161 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28bcf\" (UniqueName: \"kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf\") pod \"oauth-openshift-558db77b4-rwjkb\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.012253 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ljl2\" (UniqueName: \"kubernetes.io/projected/6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370-kube-api-access-6ljl2\") pod \"apiserver-7bbb656c7d-7sz2j\" (UID: \"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.019375 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.043486 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.044784 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.049543 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/e68dcd25-6b98-4645-b67e-fb8ef1707f6f-kube-api-access-5g97k\") pod \"machine-config-operator-74547568cd-f95hl\" (UID: \"e68dcd25-6b98-4645-b67e-fb8ef1707f6f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.064844 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.085765 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.105441 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.125142 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.153862 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.175603 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.191301 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.191937 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtsj2\" (UniqueName: \"kubernetes.io/projected/afe85b47-c14f-4f2b-8541-2f2230500b42-kube-api-access-xtsj2\") pod \"packageserver-d55dfcdfc-vb2fx\" (UID: \"afe85b47-c14f-4f2b-8541-2f2230500b42\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.203863 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.209273 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.220214 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.234120 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.244721 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.265403 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.276726 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.278216 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.283435 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2w5w\" (UniqueName: \"kubernetes.io/projected/98a99c86-d227-48f6-be6f-b0cddd0221ed-kube-api-access-z2w5w\") pod \"downloads-7954f5f757-fljtp\" (UID: \"98a99c86-d227-48f6-be6f-b0cddd0221ed\") " pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.283506 4894 request.go:700] Waited for 1.002686333s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.299395 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kz8k\" (UniqueName: \"kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k\") pod \"controller-manager-879f6c89f-kkq2l\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.311102 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.322709 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.325197 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.334770 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.347924 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.380303 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.381065 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.383596 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.424371 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdfrd\" (UniqueName: \"kubernetes.io/projected/6506d749-2f93-4065-b77a-8fd46a7494f5-kube-api-access-wdfrd\") pod \"authentication-operator-69f744f599-9qc97\" (UID: \"6506d749-2f93-4065-b77a-8fd46a7494f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.427964 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.438468 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.442266 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.445506 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: W0613 04:51:15.453845 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f8ed2c_0b5d_4d54_86e6_b87e303c166c.slice/crio-b0f34f5d899ca9167b2ed97b9fd2a76c90a618f6206e42825330e7d4826eff04 WatchSource:0}: Error finding container b0f34f5d899ca9167b2ed97b9fd2a76c90a618f6206e42825330e7d4826eff04: Status 404 returned error can't find the container with id b0f34f5d899ca9167b2ed97b9fd2a76c90a618f6206e42825330e7d4826eff04 Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.464559 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.485598 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.507962 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.508486 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: W0613 04:51:15.534614 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68dcd25_6b98_4645_b67e_fb8ef1707f6f.slice/crio-fb4ed55e1348bad9b732749a5a65b51979e8bbdcfe6deb753f52b923f0eb6643 WatchSource:0}: Error finding container fb4ed55e1348bad9b732749a5a65b51979e8bbdcfe6deb753f52b923f0eb6643: Status 404 returned error can't find the container with id fb4ed55e1348bad9b732749a5a65b51979e8bbdcfe6deb753f52b923f0eb6643 Jun 
13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.544462 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.549301 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh4pv\" (UniqueName: \"kubernetes.io/projected/08a1918e-6ac6-454c-ab92-26872b549c0e-kube-api-access-mh4pv\") pod \"console-operator-58897d9998-qkkzl\" (UID: \"08a1918e-6ac6-454c-ab92-26872b549c0e\") " pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.553309 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" event={"ID":"e2b21a60-5e2f-4892-93b1-d3de72586e25","Type":"ContainerStarted","Data":"eeb72857c06edea78de3379f4ea61aabdf314993ddd646fd4459bee564434f3a"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.553432 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" event={"ID":"e2b21a60-5e2f-4892-93b1-d3de72586e25","Type":"ContainerStarted","Data":"275d646dcda26464329c9508a2a02e3933a8c6487c88f9254c388dbcab1d2b2d"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.553446 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" event={"ID":"e2b21a60-5e2f-4892-93b1-d3de72586e25","Type":"ContainerStarted","Data":"1af05e2f657a7ccf843276271173ae35d05534cef17fb7713cf6f86497017669"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.564522 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.573284 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" event={"ID":"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba","Type":"ContainerStarted","Data":"b89f64c3c147767a1ddee3bbb9044a65852af46abf50b332fe7900fb2eaa8ec6"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.573329 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" event={"ID":"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba","Type":"ContainerStarted","Data":"b3509bdde1a91090e0c221b61a66ebb3baa277469844258310a56c94ff24a44a"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.573342 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" event={"ID":"b6c8af75-ffb5-4e91-9d8c-751ba03f67ba","Type":"ContainerStarted","Data":"3226220759eeddef0f2394b743b8e7dadadc0054b578ccb70876ee0ca45c2cb1"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.575244 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.583850 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.604869 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.608348 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.611370 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" event={"ID":"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c","Type":"ContainerStarted","Data":"b0f34f5d899ca9167b2ed97b9fd2a76c90a618f6206e42825330e7d4826eff04"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.625989 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.630115 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" event={"ID":"063e6164-98d1-4862-8aef-a3544115769f","Type":"ContainerStarted","Data":"0db87ab8886a47ba59c7352af1fb2f56ca099cae867803dc0e60a5f38f223e65"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.630151 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" event={"ID":"063e6164-98d1-4862-8aef-a3544115769f","Type":"ContainerStarted","Data":"2d62ae977b09e95c6e102001fbf4ee68f454e89b8950ba8fd0da15094a8bf36c"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.632253 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" event={"ID":"79df336b-3caf-4071-9832-3fddb99896a1","Type":"ContainerStarted","Data":"79b357bea1b11397cdb2e986801104a9e89bdb8c68bd71bffb8a77398975ca09"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.632277 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" event={"ID":"79df336b-3caf-4071-9832-3fddb99896a1","Type":"ContainerStarted","Data":"251efbcf229423eea1aea1fe8622cd9141a07031598d10ea1d56cdac032e312f"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.636874 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" event={"ID":"45928378-9580-49fd-8831-f89923a0b98e","Type":"ContainerStarted","Data":"f579f19a9a3c9cfeb2b79c8689d9e859ae95f113ec98aa7a3752a7ab21169d89"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.640295 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.641424 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" event={"ID":"5692fd22-0500-4b57-944f-f440839634cc","Type":"ContainerStarted","Data":"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.641452 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" event={"ID":"5692fd22-0500-4b57-944f-f440839634cc","Type":"ContainerStarted","Data":"054e2cb16241954a4cb5a57daa1a698eb30b95bc753477414e9d86c15408e406"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.641597 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.642744 4894 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-26ls4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.642771 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" podUID="5692fd22-0500-4b57-944f-f440839634cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.644251 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" event={"ID":"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370","Type":"ContainerStarted","Data":"30c73afbbee940dae4aff6f1bb8a27117b47fbb6c36ddeee10222e05307f2008"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.644393 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.650927 4894 generic.go:334] "Generic (PLEG): container finished" podID="ee7ab003-adaa-4207-8064-34ad105f5064" containerID="12bf34ff9d085ee77c7af6f05c93f07644410ab23c78e9d0a379c0d0e97bd41d" exitCode=0 Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.650965 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" event={"ID":"ee7ab003-adaa-4207-8064-34ad105f5064","Type":"ContainerDied","Data":"12bf34ff9d085ee77c7af6f05c93f07644410ab23c78e9d0a379c0d0e97bd41d"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.650984 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" event={"ID":"ee7ab003-adaa-4207-8064-34ad105f5064","Type":"ContainerStarted","Data":"c4a7c9b276acf054dee6b2778d731079d56dbd5feba2689fb36356755d3bf85f"} Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.664214 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.684090 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fljtp"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.686218 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.704639 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.727546 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.744426 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.765286 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.775138 4894 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9qc97"] Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.789908 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.808220 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.827306 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.845510 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.865630 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.885116 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.888720 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qkkzl"] Jun 13 04:51:15 crc kubenswrapper[4894]: W0613 04:51:15.904208 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a1918e_6ac6_454c_ab92_26872b549c0e.slice/crio-cd8368f70c184872db4a98c6033d2c79bc5d5757abf65f97fae0fe13fa49fa46 WatchSource:0}: Error finding container cd8368f70c184872db4a98c6033d2c79bc5d5757abf65f97fae0fe13fa49fa46: Status 404 returned error can't find the container with id cd8368f70c184872db4a98c6033d2c79bc5d5757abf65f97fae0fe13fa49fa46 Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.904307 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.931237 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.947633 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.964334 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jun 13 04:51:15 crc kubenswrapper[4894]: I0613 04:51:15.985956 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.007441 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.024829 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.044828 4894 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.065342 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.086130 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.107509 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.125471 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.144598 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.164420 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.186552 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.208860 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.224259 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.244886 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.266606 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.283694 4894 request.go:700] Waited for 1.90075159s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.285462 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.310686 4894 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.350106 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mff6x\" (UniqueName: \"kubernetes.io/projected/f7407ab6-0811-4fc5-9969-e866b4387f88-kube-api-access-mff6x\") pod \"service-ca-operator-777779d784-fkxrb\" (UID: \"f7407ab6-0811-4fc5-9969-e866b4387f88\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.369509 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkm5\" (UniqueName: 
\"kubernetes.io/projected/2f56ee03-040d-489c-84ff-dfac0df10942-kube-api-access-2jkm5\") pod \"dns-operator-744455d44c-7tj44\" (UID: \"2f56ee03-040d-489c-84ff-dfac0df10942\") " pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.378178 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzngn\" (UniqueName: \"kubernetes.io/projected/b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24-kube-api-access-kzngn\") pod \"etcd-operator-b45778765-2dxfq\" (UID: \"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.399384 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdw9\" (UniqueName: \"kubernetes.io/projected/c4bd4fd2-c14d-42f7-819c-84bb722484d0-kube-api-access-2qdw9\") pod \"router-default-5444994796-5tgfk\" (UID: \"c4bd4fd2-c14d-42f7-819c-84bb722484d0\") " pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.404716 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.425417 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.445090 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.467041 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.484510 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.502729 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.505278 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.515090 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.516964 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.524418 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.553213 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.586218 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.607626 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620480 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620526 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk5gw\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620588 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbzk\" (UniqueName: \"kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620617 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzml\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-kube-api-access-lmzml\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620696 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpkbq\" (UniqueName: \"kubernetes.io/projected/f4991bf6-26f7-41af-b622-8ed185db7c6a-kube-api-access-wpkbq\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620717 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620742 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620778 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620868 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620899 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620919 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63289114-7b7f-45b9-85ad-2b265f69bdee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620960 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620979 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4991bf6-26f7-41af-b622-8ed185db7c6a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.620996 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621058 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621078 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621096 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621116 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4991bf6-26f7-41af-b622-8ed185db7c6a-proxy-tls\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621144 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.621164 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwrg\" (UniqueName: \"kubernetes.io/projected/63289114-7b7f-45b9-85ad-2b265f69bdee-kube-api-access-mlwrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.623558 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.1235419 +0000 UTC m=+35.569789363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.660829 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fljtp" event={"ID":"98a99c86-d227-48f6-be6f-b0cddd0221ed","Type":"ContainerStarted","Data":"8a04b4c23a5dc8ecd95a188ff5cef9dc84d9ba81af6369824911c592e90d1050"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.661109 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fljtp" event={"ID":"98a99c86-d227-48f6-be6f-b0cddd0221ed","Type":"ContainerStarted","Data":"3b60e81fe0bf217250b7a081eb516fd36205f4684c8748f06a55a512d15007ae"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.662022 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.670261 4894 generic.go:334] "Generic (PLEG): container finished" podID="d3f8ed2c-0b5d-4d54-86e6-b87e303c166c" containerID="da1ebd2e0f424088edad20bef1bbd1ba0a22e6415391bfc0abbd5916be67194d" exitCode=0 Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.670339 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" event={"ID":"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c","Type":"ContainerDied","Data":"da1ebd2e0f424088edad20bef1bbd1ba0a22e6415391bfc0abbd5916be67194d"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.683873 4894 patch_prober.go:28] interesting pod/downloads-7954f5f757-fljtp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.683915 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fljtp" podUID="98a99c86-d227-48f6-be6f-b0cddd0221ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.689935 4894 generic.go:334] "Generic (PLEG): container finished" podID="6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370" containerID="f7278f15442e79ea9783b1f5a0ecf8bb99c611c6fd8ac7b390aa37ef0a2eca94" exitCode=0 Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.690013 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" event={"ID":"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370","Type":"ContainerDied","Data":"f7278f15442e79ea9783b1f5a0ecf8bb99c611c6fd8ac7b390aa37ef0a2eca94"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.713509 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" event={"ID":"ee7ab003-adaa-4207-8064-34ad105f5064","Type":"ContainerStarted","Data":"569d266bc6657b316b9b4c610a5651f6a533b47c493e58874ce7761bd8f9bbe6"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.713557 4894 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" event={"ID":"ee7ab003-adaa-4207-8064-34ad105f5064","Type":"ContainerStarted","Data":"4c3f43a600f19004695f095bdde4e2fe4656f450e02b61d0605d854e4d4aa0b8"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721596 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721773 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzml\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-kube-api-access-lmzml\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721801 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngf9\" (UniqueName: \"kubernetes.io/projected/08893f80-1a41-48d7-a510-b2610bf60cae-kube-api-access-kngf9\") pod \"migrator-59844c95c7-rwrdl\" (UID: \"08893f80-1a41-48d7-a510-b2610bf60cae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721817 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-srv-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721851 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721869 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6422760-85db-41eb-a616-9eca3ca624cb-signing-key\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721893 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721912 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: 
\"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721977 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.721994 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63d933d-0902-413b-86d5-75c917cebade-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.722060 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.222031601 +0000 UTC m=+35.668279064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722113 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-plugins-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722137 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722162 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tk9\" (UniqueName: \"kubernetes.io/projected/697c82f5-f7de-4836-b29b-a11d6277a00a-kube-api-access-v5tk9\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722180 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle\") pod \"console-f9d7485db-pllr7\" (UID: 
\"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722219 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722274 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdxn\" (UniqueName: \"kubernetes.io/projected/e6422760-85db-41eb-a616-9eca3ca624cb-kube-api-access-jhdxn\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722311 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxfqr\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-kube-api-access-dxfqr\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722347 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722365 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722382 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b63d933d-0902-413b-86d5-75c917cebade-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722419 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722440 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66pp\" (UniqueName: \"kubernetes.io/projected/c484470f-893c-413d-a9bc-74641a4611ca-kube-api-access-k66pp\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722458 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431e2331-d437-478a-aa43-c8b338746412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722497 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-cert\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722512 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjk8\" (UniqueName: \"kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722865 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4991bf6-26f7-41af-b622-8ed185db7c6a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722889 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722910 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8srp\" (UniqueName: \"kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.722963 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723001 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723019 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723043 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723059 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4991bf6-26f7-41af-b622-8ed185db7c6a-proxy-tls\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723096 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-csi-data-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723117 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723138 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723157 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0e212b9-6f8a-424e-a015-3105a00fde55-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723176 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 
crc kubenswrapper[4894]: I0613 04:51:16.723195 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-node-bootstrap-token\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723212 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-certs\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723389 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmr87\" (UniqueName: \"kubernetes.io/projected/7c367274-5d79-4c8e-a207-980d14875540-kube-api-access-dmr87\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723418 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbzk\" (UniqueName: \"kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723433 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpkbq\" (UniqueName: \"kubernetes.io/projected/f4991bf6-26f7-41af-b622-8ed185db7c6a-kube-api-access-wpkbq\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723486 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723526 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723544 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcvxl\" (UniqueName: \"kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723559 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723576 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fshw\" (UniqueName: \"kubernetes.io/projected/d3324c84-cea5-4046-adfe-563caa78e068-kube-api-access-9fshw\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723593 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-srv-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723608 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723625 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3324c84-cea5-4046-adfe-563caa78e068-metrics-tls\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723641 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723689 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/431e2331-d437-478a-aa43-c8b338746412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723707 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723727 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723746 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnx6k\" (UniqueName: \"kubernetes.io/projected/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-kube-api-access-bnx6k\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723764 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0e212b9-6f8a-424e-a015-3105a00fde55-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723787 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723822 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lpqk\" (UniqueName: \"kubernetes.io/projected/dade6111-6af8-4b41-bd58-ddfe5180eefd-kube-api-access-9lpqk\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723839 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723854 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-registration-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723869 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-socket-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723886 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723905 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63289114-7b7f-45b9-85ad-2b265f69bdee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723922 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-profile-collector-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723937 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwwfd\" (UniqueName: \"kubernetes.io/projected/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-kube-api-access-mwwfd\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723953 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6422760-85db-41eb-a616-9eca3ca624cb-signing-cabundle\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723982 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.723997 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c484470f-893c-413d-a9bc-74641a4611ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724015 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/431e2331-d437-478a-aa43-c8b338746412-config\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724093 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63d933d-0902-413b-86d5-75c917cebade-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724142 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724157 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3324c84-cea5-4046-adfe-563caa78e068-config-volume\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724173 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8qf\" (UniqueName: \"kubernetes.io/projected/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-kube-api-access-zq8qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724192 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwm2w\" (UniqueName: \"kubernetes.io/projected/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-kube-api-access-gwm2w\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724206 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-config\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724222 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724246 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-mountpoint-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724272 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwrg\" (UniqueName: \"kubernetes.io/projected/63289114-7b7f-45b9-85ad-2b265f69bdee-kube-api-access-mlwrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724298 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfc96\" (UniqueName: \"kubernetes.io/projected/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-kube-api-access-bfc96\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.724325 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk5gw\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.764416 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.770199 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb"] Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.770247 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.774188 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4991bf6-26f7-41af-b622-8ed185db7c6a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.775981 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc 
kubenswrapper[4894]: I0613 04:51:16.776466 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" event={"ID":"00185c58-85dc-4395-9b1e-9662609bd88a","Type":"ContainerStarted","Data":"0bfd98c0ce4b67a8067b317e429f98c8261ed79df62d2606ccd098b1f821fa2c"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.776498 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" event={"ID":"00185c58-85dc-4395-9b1e-9662609bd88a","Type":"ContainerStarted","Data":"5a5fac4256d9a2eed078b7f88158218c044fba71e8f3627bb804948aed24a577"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.778821 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.782726 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.784269 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.284248766 +0000 UTC m=+35.730496229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.784984 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.786033 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.786499 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.787165 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.789889 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk5gw\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.790497 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.790504 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63289114-7b7f-45b9-85ad-2b265f69bdee-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.791488 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" 
event={"ID":"e68dcd25-6b98-4645-b67e-fb8ef1707f6f","Type":"ContainerStarted","Data":"d106f039ffaa3231f03600866b25dd45601787206e30dc98f178dc5e68f7b5fb"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.791542 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" event={"ID":"e68dcd25-6b98-4645-b67e-fb8ef1707f6f","Type":"ContainerStarted","Data":"e5c05fa0260440ceb540484d175365779750b6f9015e7ebafa619010f60a4832"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.791552 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" event={"ID":"e68dcd25-6b98-4645-b67e-fb8ef1707f6f","Type":"ContainerStarted","Data":"fb4ed55e1348bad9b732749a5a65b51979e8bbdcfe6deb753f52b923f0eb6643"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.792130 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzml\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-kube-api-access-lmzml\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.792190 4894 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kkq2l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.792220 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.794384 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4991bf6-26f7-41af-b622-8ed185db7c6a-proxy-tls\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.811443 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb34c26-69f2-4fd0-9a68-d2a2c7940e84-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t2ksj\" (UID: \"beb34c26-69f2-4fd0-9a68-d2a2c7940e84\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.814466 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5tgfk" event={"ID":"c4bd4fd2-c14d-42f7-819c-84bb722484d0","Type":"ContainerStarted","Data":"46b21c3180aca3c5e3d0dad5365f7cc8c7a74970ed2f05905a88e483d91a66e4"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.826833 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" 
event={"ID":"afe85b47-c14f-4f2b-8541-2f2230500b42","Type":"ContainerStarted","Data":"6da26409150fe692646cd0e5b2dcd006bbc0ddda4e15cfe796455b84b5fddd7a"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.826879 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" event={"ID":"afe85b47-c14f-4f2b-8541-2f2230500b42","Type":"ContainerStarted","Data":"2f781b27f892d60b4d8f9c9de5aae676f96f84a4049188f7b1daddb922479cb1"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827103 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827366 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-node-bootstrap-token\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827392 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827426 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-certs\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827442 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmr87\" (UniqueName: \"kubernetes.io/projected/7c367274-5d79-4c8e-a207-980d14875540-kube-api-access-dmr87\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827496 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcvxl\" (UniqueName: \"kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827519 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827537 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9fshw\" (UniqueName: \"kubernetes.io/projected/d3324c84-cea5-4046-adfe-563caa78e068-kube-api-access-9fshw\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827553 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-srv-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827567 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827583 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3324c84-cea5-4046-adfe-563caa78e068-metrics-tls\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827597 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827613 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827619 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827630 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/431e2331-d437-478a-aa43-c8b338746412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827661 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827678 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827694 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnx6k\" (UniqueName: \"kubernetes.io/projected/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-kube-api-access-bnx6k\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827715 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0e212b9-6f8a-424e-a015-3105a00fde55-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827733 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lpqk\" (UniqueName: \"kubernetes.io/projected/dade6111-6af8-4b41-bd58-ddfe5180eefd-kube-api-access-9lpqk\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827750 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827764 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-registration-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827783 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-socket-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827802 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6422760-85db-41eb-a616-9eca3ca624cb-signing-cabundle\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827820 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-profile-collector-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827834 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwwfd\" (UniqueName: \"kubernetes.io/projected/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-kube-api-access-mwwfd\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827852 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827867 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c484470f-893c-413d-a9bc-74641a4611ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827883 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431e2331-d437-478a-aa43-c8b338746412-config\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827908 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63d933d-0902-413b-86d5-75c917cebade-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827935 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8qf\" (UniqueName: \"kubernetes.io/projected/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-kube-api-access-zq8qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827961 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3324c84-cea5-4046-adfe-563caa78e068-config-volume\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.827976 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: 
I0613 04:51:16.827992 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm2w\" (UniqueName: \"kubernetes.io/projected/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-kube-api-access-gwm2w\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.828014 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.327995518 +0000 UTC m=+35.774242981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828038 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-config\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828067 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-mountpoint-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828102 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfc96\" (UniqueName: \"kubernetes.io/projected/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-kube-api-access-bfc96\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828140 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngf9\" (UniqueName: \"kubernetes.io/projected/08893f80-1a41-48d7-a510-b2610bf60cae-kube-api-access-kngf9\") pod \"migrator-59844c95c7-rwrdl\" (UID: \"08893f80-1a41-48d7-a510-b2610bf60cae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828156 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-srv-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828171 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/e6422760-85db-41eb-a616-9eca3ca624cb-signing-key\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828189 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828218 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828236 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828266 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63d933d-0902-413b-86d5-75c917cebade-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828298 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-plugins-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828312 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828331 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828345 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tk9\" (UniqueName: \"kubernetes.io/projected/697c82f5-f7de-4836-b29b-a11d6277a00a-kube-api-access-v5tk9\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828376 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828413 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdxn\" (UniqueName: \"kubernetes.io/projected/e6422760-85db-41eb-a616-9eca3ca624cb-kube-api-access-jhdxn\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828430 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxfqr\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-kube-api-access-dxfqr\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828445 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b63d933d-0902-413b-86d5-75c917cebade-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828469 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828489 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66pp\" (UniqueName: \"kubernetes.io/projected/c484470f-893c-413d-a9bc-74641a4611ca-kube-api-access-k66pp\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828506 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431e2331-d437-478a-aa43-c8b338746412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828521 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjk8\" (UniqueName: \"kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828546 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-cert\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828598 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8srp\" (UniqueName: \"kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828628 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.828645 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.829089 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-socket-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.833414 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6422760-85db-41eb-a616-9eca3ca624cb-signing-cabundle\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.837822 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/431e2331-d437-478a-aa43-c8b338746412-config\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.847028 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63d933d-0902-413b-86d5-75c917cebade-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.848167 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3324c84-cea5-4046-adfe-563caa78e068-config-volume\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " 
pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.848632 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.849062 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.849696 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.850948 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.851269 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.851344 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-csi-data-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.851377 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.851404 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.851423 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0e212b9-6f8a-424e-a015-3105a00fde55-trusted-ca\") 
pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.852279 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0e212b9-6f8a-424e-a015-3105a00fde55-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.852353 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-csi-data-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.854354 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-config\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.854423 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-mountpoint-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.854723 4894 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vb2fx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" start-of-body= Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.856737 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.857422 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" podUID="afe85b47-c14f-4f2b-8541-2f2230500b42" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.859486 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.861398 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.862984 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.864303 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-plugins-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.864815 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.869837 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.869942 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.870867 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3324c84-cea5-4046-adfe-563caa78e068-metrics-tls\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.871352 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6422760-85db-41eb-a616-9eca3ca624cb-signing-key\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.871668 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63d933d-0902-413b-86d5-75c917cebade-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.872068 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-profile-collector-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.872212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-node-bootstrap-token\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.872327 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" event={"ID":"45928378-9580-49fd-8831-f89923a0b98e","Type":"ContainerStarted","Data":"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.872574 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dade6111-6af8-4b41-bd58-ddfe5180eefd-srv-cert\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.872623 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c367274-5d79-4c8e-a207-980d14875540-certs\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.873022 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/697c82f5-f7de-4836-b29b-a11d6277a00a-srv-cert\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.873087 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.873579 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.875501 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.876458 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config\") pod \"console-f9d7485db-pllr7\" (UID: 
\"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.876523 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-registration-dir\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.880414 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.38039817 +0000 UTC m=+35.826645633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.882254 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" event={"ID":"08a1918e-6ac6-454c-ab92-26872b549c0e","Type":"ContainerStarted","Data":"7f7eee0046c0400b5c696972590eb2776047d5aa8bf6f53f52be11895b197456"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.882293 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" event={"ID":"08a1918e-6ac6-454c-ab92-26872b549c0e","Type":"ContainerStarted","Data":"cd8368f70c184872db4a98c6033d2c79bc5d5757abf65f97fae0fe13fa49fa46"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.882600 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.882860 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.887054 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" event={"ID":"6506d749-2f93-4065-b77a-8fd46a7494f5","Type":"ContainerStarted","Data":"d8988a31f49055e0f2a47d469312239b146432806f08e7b20dbcd13e320b119c"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.887082 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" event={"ID":"6506d749-2f93-4065-b77a-8fd46a7494f5","Type":"ContainerStarted","Data":"e9b30b117d74e3085ddf0cbbd758a00eb7f451894ba166d0f545f0e32a4f27d2"} Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.897296 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.900198 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.900774 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwrg\" (UniqueName: \"kubernetes.io/projected/63289114-7b7f-45b9-85ad-2b265f69bdee-kube-api-access-mlwrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v28j\" (UID: \"63289114-7b7f-45b9-85ad-2b265f69bdee\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.900897 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0e212b9-6f8a-424e-a015-3105a00fde55-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.902019 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-cert\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.902209 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.902719 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c484470f-893c-413d-a9bc-74641a4611ca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.914350 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.914673 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/431e2331-d437-478a-aa43-c8b338746412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.925183 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-2dxfq"] Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.927553 4894 patch_prober.go:28] interesting pod/console-operator-58897d9998-qkkzl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.927597 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" podUID="08a1918e-6ac6-454c-ab92-26872b549c0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.928386 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpkbq\" (UniqueName: \"kubernetes.io/projected/f4991bf6-26f7-41af-b622-8ed185db7c6a-kube-api-access-wpkbq\") pod \"machine-config-controller-84d6567774-2dbc7\" (UID: \"f4991bf6-26f7-41af-b622-8ed185db7c6a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.936507 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm2w\" (UniqueName: \"kubernetes.io/projected/7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee-kube-api-access-gwm2w\") pod \"csi-hostpathplugin-ns4bb\" (UID: \"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee\") " pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.938058 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbzk\" (UniqueName: \"kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk\") pod \"marketplace-operator-79b997595-tr7lv\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:16 crc kubenswrapper[4894]: W0613 04:51:16.940327 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d821e8_35ad_40c4_bbf0_5e79d0bc1d24.slice/crio-96eb1348a6f881b6153466adb4af7a6daf614babc9894fedb148acb8bda8e4c1 WatchSource:0}: Error finding container 96eb1348a6f881b6153466adb4af7a6daf614babc9894fedb148acb8bda8e4c1: Status 404 returned error can't find the container with id 96eb1348a6f881b6153466adb4af7a6daf614babc9894fedb148acb8bda8e4c1 Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.944995 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.952284 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:16 crc kubenswrapper[4894]: E0613 04:51:16.953708 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-06-13 04:51:17.453690284 +0000 UTC m=+35.899937747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.955978 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcvxl\" (UniqueName: \"kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl\") pod \"collect-profiles-29163165-4wqmd\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.986597 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:16 crc kubenswrapper[4894]: I0613 04:51:16.987102 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.008557 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7tj44"] Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.016797 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fshw\" (UniqueName: \"kubernetes.io/projected/d3324c84-cea5-4046-adfe-563caa78e068-kube-api-access-9fshw\") pod \"dns-default-klddm\" (UID: \"d3324c84-cea5-4046-adfe-563caa78e068\") " pod="openshift-dns/dns-default-klddm" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.016988 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwwfd\" (UniqueName: \"kubernetes.io/projected/f5204aa1-eb4d-4300-9e08-9403f16b8c3e-kube-api-access-mwwfd\") pod \"ingress-canary-t5g77\" (UID: \"f5204aa1-eb4d-4300-9e08-9403f16b8c3e\") " pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.029980 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dpxmr\" (UID: \"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.038883 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.043180 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngf9\" (UniqueName: \"kubernetes.io/projected/08893f80-1a41-48d7-a510-b2610bf60cae-kube-api-access-kngf9\") pod \"migrator-59844c95c7-rwrdl\" (UID: \"08893f80-1a41-48d7-a510-b2610bf60cae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.054362 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.055341 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.555329206 +0000 UTC m=+36.001576669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.059923 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.065761 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.075160 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8qf\" (UniqueName: \"kubernetes.io/projected/e8208b57-8fd4-43b0-8a1f-a294cde0fcea-kube-api-access-zq8qf\") pod \"openshift-controller-manager-operator-756b6f6bc6-rpbsd\" (UID: \"e8208b57-8fd4-43b0-8a1f-a294cde0fcea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.102253 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tk9\" (UniqueName: \"kubernetes.io/projected/697c82f5-f7de-4836-b29b-a11d6277a00a-kube-api-access-v5tk9\") pod \"olm-operator-6b444d44fb-w4dzh\" (UID: \"697c82f5-f7de-4836-b29b-a11d6277a00a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.117486 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmr87\" (UniqueName: \"kubernetes.io/projected/7c367274-5d79-4c8e-a207-980d14875540-kube-api-access-dmr87\") pod \"machine-config-server-cmqdb\" (UID: \"7c367274-5d79-4c8e-a207-980d14875540\") " pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.121475 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfc96\" (UniqueName: \"kubernetes.io/projected/2cc9c108-e3dd-445f-9f85-ae04a17c68ba-kube-api-access-bfc96\") pod \"kube-storage-version-migrator-operator-b67b599dd-sg69n\" (UID: \"2cc9c108-e3dd-445f-9f85-ae04a17c68ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.152494 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.157874 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.158629 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.159076 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.659058769 +0000 UTC m=+36.105306232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.162992 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/431e2331-d437-478a-aa43-c8b338746412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgtfp\" (UID: \"431e2331-d437-478a-aa43-c8b338746412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.170685 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.183529 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66pp\" (UniqueName: \"kubernetes.io/projected/c484470f-893c-413d-a9bc-74641a4611ca-kube-api-access-k66pp\") pod \"package-server-manager-789f6589d5-jm76g\" (UID: \"c484470f-893c-413d-a9bc-74641a4611ca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.230008 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8srp\" (UniqueName: \"kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp\") pod \"console-f9d7485db-pllr7\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.231426 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.234046 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.243410 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxfqr\" (UniqueName: \"kubernetes.io/projected/d0e212b9-6f8a-424e-a015-3105a00fde55-kube-api-access-dxfqr\") pod \"ingress-operator-5b745b69d9-dbbt2\" (UID: \"d0e212b9-6f8a-424e-a015-3105a00fde55\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.281692 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnx6k\" (UniqueName: \"kubernetes.io/projected/9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f-kube-api-access-bnx6k\") pod \"multus-admission-controller-857f4d67dd-wwzgq\" (UID: \"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.283009 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t5g77" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.283187 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.283679 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmqdb" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.287777 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.288143 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.788128672 +0000 UTC m=+36.234376135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.290518 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjk8\" (UniqueName: \"kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8\") pod \"cni-sysctl-allowlist-ds-z8rxn\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.298173 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.298711 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-klddm" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.305765 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.317291 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lpqk\" (UniqueName: \"kubernetes.io/projected/dade6111-6af8-4b41-bd58-ddfe5180eefd-kube-api-access-9lpqk\") pod \"catalog-operator-68c6474976-c629l\" (UID: \"dade6111-6af8-4b41-bd58-ddfe5180eefd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.326489 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b63d933d-0902-413b-86d5-75c917cebade-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k2zcl\" (UID: \"b63d933d-0902-413b-86d5-75c917cebade\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.341772 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.344431 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdxn\" (UniqueName: \"kubernetes.io/projected/e6422760-85db-41eb-a616-9eca3ca624cb-kube-api-access-jhdxn\") pod \"service-ca-9c57cc56f-6bmj6\" (UID: \"e6422760-85db-41eb-a616-9eca3ca624cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.359526 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.380536 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.412600 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.413107 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:17.913086936 +0000 UTC m=+36.359334389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.453970 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.489246 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.508028 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.514783 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.515150 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.01513698 +0000 UTC m=+36.461384433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.518866 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.518921 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.518990 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.559302 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.582455 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ns4bb"] Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.617295 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.617623 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.117591406 +0000 UTC m=+36.563838869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.617827 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.618108 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.118101701 +0000 UTC m=+36.564349164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.663628 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd"] Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.706357 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j"] Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.723388 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.724013 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.223996675 +0000 UTC m=+36.670244138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.775159 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl"] Jun 13 04:51:17 crc kubenswrapper[4894]: W0613 04:51:17.823519 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dbc6fdc_ec5e_4aec_bfb8_8059b8382cee.slice/crio-f408e8c54fdaa6c69e2be08b2749eaf4c59df923de365372d761595c26963ebe WatchSource:0}: Error finding container f408e8c54fdaa6c69e2be08b2749eaf4c59df923de365372d761595c26963ebe: Status 404 returned error can't find the container with id f408e8c54fdaa6c69e2be08b2749eaf4c59df923de365372d761595c26963ebe Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.829284 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.829744 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-06-13 04:51:18.329730995 +0000 UTC m=+36.775978458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: W0613 04:51:17.842868 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e188bf_fe34_404b_8bb3_ec0ca09e013d.slice/crio-f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da WatchSource:0}: Error finding container f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da: Status 404 returned error can't find the container with id f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.854701 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj"] Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.877748 4894 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rwjkb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.877797 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.901231 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-f95hl" podStartSLOduration=14.901216478 podStartE2EDuration="14.901216478s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:17.869183784 +0000 UTC m=+36.315431247" watchObservedRunningTime="2025-06-13 04:51:17.901216478 +0000 UTC m=+36.347463941" Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.931443 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.931694 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.431645585 +0000 UTC m=+36.877893048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.931834 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:17 crc kubenswrapper[4894]: E0613 04:51:17.932102 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.432091918 +0000 UTC m=+36.878339381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.948281 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" event={"ID":"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24","Type":"ContainerStarted","Data":"1c44f0fa3bfeb2c2aab75081c2c9d67d6294cac69d420b17b4a054cc5708ecb4"} Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.948332 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" event={"ID":"b8d821e8-35ad-40c4-bbf0-5e79d0bc1d24","Type":"ContainerStarted","Data":"96eb1348a6f881b6153466adb4af7a6daf614babc9894fedb148acb8bda8e4c1"} Jun 13 04:51:17 crc kubenswrapper[4894]: I0613 04:51:17.992048 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" event={"ID":"58e188bf-fe34-404b-8bb3-ec0ca09e013d","Type":"ContainerStarted","Data":"f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.022916 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qjhrn" podStartSLOduration=16.022895068 podStartE2EDuration="16.022895068s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.021328173 +0000 UTC m=+36.467575636" watchObservedRunningTime="2025-06-13 04:51:18.022895068 +0000 UTC m=+36.469142531" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.023565 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jz9cp" 
podStartSLOduration=17.023559997 podStartE2EDuration="17.023559997s" podCreationTimestamp="2025-06-13 04:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:17.983402698 +0000 UTC m=+36.429650161" watchObservedRunningTime="2025-06-13 04:51:18.023559997 +0000 UTC m=+36.469807460" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035334 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.035532 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.535498111 +0000 UTC m=+36.981745574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035715 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035764 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035825 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035865 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.035888 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.036167 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.53615394 +0000 UTC m=+36.982401403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.037298 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.049027 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.050304 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.059951 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.071605 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" event={"ID":"d3f8ed2c-0b5d-4d54-86e6-b87e303c166c","Type":"ContainerStarted","Data":"c900a80b8388ea74facdbd7b085acb1a6dc483a27215a0a00d75af5606b58837"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.082764 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.111821 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-5tgfk" event={"ID":"c4bd4fd2-c14d-42f7-819c-84bb722484d0","Type":"ContainerStarted","Data":"5b3fa9adcd3ed8d7585ef1c13d7e90956e9af6a7fed95d9fbd120558a0441f85"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.150536 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.151002 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.650983923 +0000 UTC m=+37.097231386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.151162 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.151504 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.651495547 +0000 UTC m=+37.097743010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.177701 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7"] Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.202521 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr"] Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.202843 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" event={"ID":"6f5bd51c-0b1d-4ffc-99c6-3d6e1de45370","Type":"ContainerStarted","Data":"722344eeb75110bfd3f22d93bf02a1c1547502879537fac36e3098d6db7e9459"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.204442 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" event={"ID":"f7407ab6-0811-4fc5-9969-e866b4387f88","Type":"ContainerStarted","Data":"cbf94ff201dbde870ad803fc71bfc486a7f3874a46b4f5b69432f691afa1066e"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.204493 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" event={"ID":"f7407ab6-0811-4fc5-9969-e866b4387f88","Type":"ContainerStarted","Data":"6b8f0ac307c9c3f9b0848582a99f55968be8b6f06978876f62e571bc1fd3a739"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.215771 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.216900 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" event={"ID":"63289114-7b7f-45b9-85ad-2b265f69bdee","Type":"ContainerStarted","Data":"b747fe402c785309512205ec119916d88375af3c520e2bc51a6843c5c50c8569"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.218102 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.242852 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" event={"ID":"08893f80-1a41-48d7-a510-b2610bf60cae","Type":"ContainerStarted","Data":"6d9f34fa63a04659391e44ab55604205083b788f724932acc59b05537a381002"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.255012 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.255102 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.755081156 +0000 UTC m=+37.201328619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.255289 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.256454 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.756440545 +0000 UTC m=+37.202688008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.274905 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" event={"ID":"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee","Type":"ContainerStarted","Data":"f408e8c54fdaa6c69e2be08b2749eaf4c59df923de365372d761595c26963ebe"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.304381 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.322567 4894 patch_prober.go:28] interesting pod/downloads-7954f5f757-fljtp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.322621 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fljtp" podUID="98a99c86-d227-48f6-be6f-b0cddd0221ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.336842 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" event={"ID":"2f56ee03-040d-489c-84ff-dfac0df10942","Type":"ContainerStarted","Data":"de7036b7d4b3406c0f4ea40390620e66a25c7bf377f6793705c8b64e9141607e"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.336876 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" event={"ID":"2f56ee03-040d-489c-84ff-dfac0df10942","Type":"ContainerStarted","Data":"378c69d101993371c1cddcf2349816862d4f11b76521158472c6da527479fbe2"} Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.343181 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.356348 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.357500 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.857477349 +0000 UTC m=+37.303724812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.357837 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.395567 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.396573 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" podStartSLOduration=16.396556117 podStartE2EDuration="16.396556117s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.349850829 +0000 UTC m=+36.796098292" watchObservedRunningTime="2025-06-13 04:51:18.396556117 +0000 UTC m=+36.842803580" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.445063 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j4qlc" podStartSLOduration=16.445049056 podStartE2EDuration="16.445049056s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.443527152 +0000 UTC m=+36.889774615" watchObservedRunningTime="2025-06-13 04:51:18.445049056 +0000 UTC m=+36.891296509" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.462873 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.483937 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:18.983915347 +0000 UTC m=+37.430162810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.503685 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.534795 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:18 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:18 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:18 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.534849 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.567727 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.568225 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.068202348 +0000 UTC m=+37.514449821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.580730 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" podStartSLOduration=16.580713719 podStartE2EDuration="16.580713719s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.579113453 +0000 UTC m=+37.025360916" watchObservedRunningTime="2025-06-13 04:51:18.580713719 +0000 UTC m=+37.026961182" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.669356 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.669725 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.169712156 +0000 UTC m=+37.615959619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.672569 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fljtp" podStartSLOduration=16.672545958 podStartE2EDuration="16.672545958s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.622130314 +0000 UTC m=+37.068377777" watchObservedRunningTime="2025-06-13 04:51:18.672545958 +0000 UTC m=+37.118793421" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.757042 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5tgfk" podStartSLOduration=16.757026244 podStartE2EDuration="16.757026244s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.749801936 +0000 UTC m=+37.196049399" watchObservedRunningTime="2025-06-13 04:51:18.757026244 +0000 UTC m=+37.203273697" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.781326 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.781613 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.281593803 +0000 UTC m=+37.727841256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.785200 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.886133 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.889921 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.389903967 +0000 UTC m=+37.836151430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.905713 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" podStartSLOduration=16.905695773 podStartE2EDuration="16.905695773s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:18.905172478 +0000 UTC m=+37.351419941" watchObservedRunningTime="2025-06-13 04:51:18.905695773 +0000 UTC m=+37.351943236" Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.907962 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd"] Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.989113 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.989301 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.489274284 +0000 UTC m=+37.935521747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:18 crc kubenswrapper[4894]: I0613 04:51:18.989407 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:18 crc kubenswrapper[4894]: E0613 04:51:18.989722 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.489708856 +0000 UTC m=+37.935956309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.004076 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vb2fx" podStartSLOduration=16.00405904 podStartE2EDuration="16.00405904s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.002491015 +0000 UTC m=+37.448738478" watchObservedRunningTime="2025-06-13 04:51:19.00405904 +0000 UTC m=+37.450306503" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.072413 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" podStartSLOduration=16.072394792 podStartE2EDuration="16.072394792s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.031912514 +0000 UTC m=+37.478159977" watchObservedRunningTime="2025-06-13 04:51:19.072394792 +0000 UTC m=+37.518642245" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.090173 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.090465 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.590444812 +0000 UTC m=+38.036692275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.194421 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.194842 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.694823743 +0000 UTC m=+38.141071206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.295681 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.296191 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.796173327 +0000 UTC m=+38.242420790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.332930 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qkkzl" podStartSLOduration=17.332912597 podStartE2EDuration="17.332912597s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.271283209 +0000 UTC m=+37.717530672" watchObservedRunningTime="2025-06-13 04:51:19.332912597 +0000 UTC m=+37.779160060" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.399852 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.400180 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:19.900167807 +0000 UTC m=+38.346415270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.405767 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.405944 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.414972 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" event={"ID":"63289114-7b7f-45b9-85ad-2b265f69bdee","Type":"ContainerStarted","Data":"f43eee892abe5019fdbc36351e6bc1ab5819cf5021518f94d91a0e099ed6d01c"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.466259 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" event={"ID":"58e188bf-fe34-404b-8bb3-ec0ca09e013d","Type":"ContainerStarted","Data":"ca4a81ad914b49013193ba58ff55bdd4f7d7cea44844f8e6d9abef1c28f93a8e"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.501054 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-chjqk" podStartSLOduration=16.501036507 podStartE2EDuration="16.501036507s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.38604668 +0000 UTC m=+37.832294143" watchObservedRunningTime="2025-06-13 04:51:19.501036507 +0000 UTC m=+37.947283960" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.504288 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.505180 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.005156946 +0000 UTC m=+38.451404409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.533007 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:19 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:19 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:19 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.533067 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" event={"ID":"beb34c26-69f2-4fd0-9a68-d2a2c7940e84","Type":"ContainerStarted","Data":"cb07f0b5385796e7e65228456c1278e718e7d4daf09aae9406ee959aecaacd9d"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.533078 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.554507 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmqdb" event={"ID":"7c367274-5d79-4c8e-a207-980d14875540","Type":"ContainerStarted","Data":"6c54ef12a9637386fc056c23f054ef3016c673d0fa1241a4dc1831f1450de102"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.572213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" event={"ID":"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe","Type":"ContainerStarted","Data":"c72d29e905ec6dde090ea29a6cd72888dd660eccef8315b875275f340995f051"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.575814 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" event={"ID":"cf11c387-9f91-4c1c-aaea-69f41a35d30c","Type":"ContainerStarted","Data":"abfe5dbfb1e156199ba0f6c8eabadb3f94bbb9b942a746ce14ab2c646d5b45e1"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.607922 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.608530 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.108510777 +0000 UTC m=+38.554758240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.622585 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" event={"ID":"e8208b57-8fd4-43b0-8a1f-a294cde0fcea","Type":"ContainerStarted","Data":"5210bcd476354e58f3d56c3d37edbc8345b278c9cdda360caf0f3feabbd5e75a"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.632744 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9qc97" podStartSLOduration=17.632721416 podStartE2EDuration="17.632721416s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.622325046 +0000 UTC m=+38.068572509" watchObservedRunningTime="2025-06-13 04:51:19.632721416 +0000 UTC m=+38.078968869" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.657666 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" event={"ID":"2f56ee03-040d-489c-84ff-dfac0df10942","Type":"ContainerStarted","Data":"e8b604241105d5b65fd3c710d9123f29f009c3496c4b408d89fd6de8cedfb9ce"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.659563 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" event={"ID":"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99","Type":"ContainerStarted","Data":"8316257f601297d5adfbad3f37d13b2cbffd43b84f1ee54cad73cd03832e7993"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.708758 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" event={"ID":"f4991bf6-26f7-41af-b622-8ed185db7c6a","Type":"ContainerStarted","Data":"69963af3f3b76d3c8134d4f81be6cdfc99e4f4bdf5c8b5f4d19ae978a064ac2b"} Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.711696 4894 patch_prober.go:28] interesting pod/downloads-7954f5f757-fljtp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.711808 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fljtp" podUID="98a99c86-d227-48f6-be6f-b0cddd0221ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.714299 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:19 crc 
kubenswrapper[4894]: E0613 04:51:19.715322 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.215302918 +0000 UTC m=+38.661550381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.797237 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cmqdb" podStartSLOduration=6.797221331 podStartE2EDuration="6.797221331s" podCreationTimestamp="2025-06-13 04:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.796775488 +0000 UTC m=+38.243022951" watchObservedRunningTime="2025-06-13 04:51:19.797221331 +0000 UTC m=+38.243468794" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.816884 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.821848 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.321833581 +0000 UTC m=+38.768081044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.899182 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" podStartSLOduration=16.899162942 podStartE2EDuration="16.899162942s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.870054952 +0000 UTC m=+38.316302415" watchObservedRunningTime="2025-06-13 04:51:19.899162942 +0000 UTC m=+38.345410405" Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.920855 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:19 crc kubenswrapper[4894]: E0613 04:51:19.923752 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.42372694 +0000 UTC m=+38.869974403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:19 crc kubenswrapper[4894]: I0613 04:51:19.990314 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fkxrb" podStartSLOduration=16.990300241 podStartE2EDuration="16.990300241s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:19.987723716 +0000 UTC m=+38.433971179" watchObservedRunningTime="2025-06-13 04:51:19.990300241 +0000 UTC m=+38.436547704" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.020361 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.020444 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.020487 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.022468 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.022519 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.022950 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.522937922 +0000 UTC m=+38.969185385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.031866 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c225ccc7-9659-4c3e-9256-af46e1dd1cd6-metrics-certs\") pod \"network-metrics-daemon-4dj8k\" (UID: \"c225ccc7-9659-4c3e-9256-af46e1dd1cd6\") " pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.057311 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7tj44" podStartSLOduration=18.057296813 podStartE2EDuration="18.057296813s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.054903054 +0000 UTC m=+38.501150517" watchObservedRunningTime="2025-06-13 04:51:20.057296813 +0000 UTC m=+38.503544276" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.087898 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.094451 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.123084 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.123329 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.623282327 +0000 UTC m=+39.069529790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.123629 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.124025 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.624018348 +0000 UTC m=+39.070265811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.125528 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4dj8k" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.143497 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" podStartSLOduration=19.14348166 podStartE2EDuration="19.14348166s" podCreationTimestamp="2025-06-13 04:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.141440181 +0000 UTC m=+38.587687644" watchObservedRunningTime="2025-06-13 04:51:20.14348166 +0000 UTC m=+38.589729123" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.168019 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.205270 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v28j" podStartSLOduration=17.205250831 podStartE2EDuration="17.205250831s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.203949744 +0000 UTC m=+38.650197207" watchObservedRunningTime="2025-06-13 04:51:20.205250831 +0000 UTC m=+38.651498294" Jun 13 04:51:20 crc kubenswrapper[4894]: W0613 04:51:20.207671 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod431e2331_d437_478a_aa43_c8b338746412.slice/crio-29bfbf7da8cfee2a28c1e212d82da8434e465095f37524395403cec9df4262aa WatchSource:0}: Error finding container 29bfbf7da8cfee2a28c1e212d82da8434e465095f37524395403cec9df4262aa: Status 404 returned error can't find the container with id 29bfbf7da8cfee2a28c1e212d82da8434e465095f37524395403cec9df4262aa Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.224930 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.225241 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.725211007 +0000 UTC m=+39.171458470 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.262702 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hqzfn" podStartSLOduration=18.262678628 podStartE2EDuration="18.262678628s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.258347713 +0000 UTC m=+38.704595166" watchObservedRunningTime="2025-06-13 04:51:20.262678628 +0000 UTC m=+38.708926091" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.331281 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.331755 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.83174017 +0000 UTC m=+39.277987633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.447634 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.448008 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.947973343 +0000 UTC m=+39.394220806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.448144 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.448519 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:20.948511409 +0000 UTC m=+39.394758872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.452715 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2dxfq" podStartSLOduration=18.452682758999998 podStartE2EDuration="18.452682759s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.418339418 +0000 UTC m=+38.864586881" watchObservedRunningTime="2025-06-13 04:51:20.452682759 +0000 UTC m=+38.898930222" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.523545 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:20 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:20 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:20 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.523860 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.549130 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc 
kubenswrapper[4894]: E0613 04:51:20.549521 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.049503962 +0000 UTC m=+39.495751425 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.555248 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwzgq"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.558102 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.599569 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-klddm"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.599625 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.601264 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g"] Jun 13 04:51:20 crc kubenswrapper[4894]: W0613 04:51:20.615239 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cc9c108_e3dd_445f_9f85_ae04a17c68ba.slice/crio-9165f784503efc71af0ddaff953622d9b147c0c4f4fc9b5f34e298ee70780117 WatchSource:0}: Error finding container 9165f784503efc71af0ddaff953622d9b147c0c4f4fc9b5f34e298ee70780117: Status 404 returned error can't find the container with id 9165f784503efc71af0ddaff953622d9b147c0c4f4fc9b5f34e298ee70780117 Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.617343 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.650374 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.650693 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.150679911 +0000 UTC m=+39.596927374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.675843 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t5g77"] Jun 13 04:51:20 crc kubenswrapper[4894]: W0613 04:51:20.690874 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc484470f_893c_413d_a9bc_74641a4611ca.slice/crio-beddd4cbefaff48332ba26a0976cd0690b07014cf35bba5afad4ffb483970862 WatchSource:0}: Error finding container beddd4cbefaff48332ba26a0976cd0690b07014cf35bba5afad4ffb483970862: Status 404 returned error can't find the container with id beddd4cbefaff48332ba26a0976cd0690b07014cf35bba5afad4ffb483970862 Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.753790 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.754234 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.254215817 +0000 UTC m=+39.700463280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.761780 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" event={"ID":"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f","Type":"ContainerStarted","Data":"142624fde98a1d9714c5c4f9be7e35df804d2ca8828a55ae0277fb862a5e9965"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.764555 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" event={"ID":"c484470f-893c-413d-a9bc-74641a4611ca","Type":"ContainerStarted","Data":"beddd4cbefaff48332ba26a0976cd0690b07014cf35bba5afad4ffb483970862"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.765399 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" event={"ID":"431e2331-d437-478a-aa43-c8b338746412","Type":"ContainerStarted","Data":"29bfbf7da8cfee2a28c1e212d82da8434e465095f37524395403cec9df4262aa"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.766902 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" event={"ID":"beb34c26-69f2-4fd0-9a68-d2a2c7940e84","Type":"ContainerStarted","Data":"9562babad18e1cad789eb4b8bf3351c17fe62c5e83efe5b6163078acd465c517"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.778094 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" event={"ID":"2cc9c108-e3dd-445f-9f85-ae04a17c68ba","Type":"ContainerStarted","Data":"9165f784503efc71af0ddaff953622d9b147c0c4f4fc9b5f34e298ee70780117"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.784564 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" event={"ID":"b63d933d-0902-413b-86d5-75c917cebade","Type":"ContainerStarted","Data":"8615012d0e7fca9c82f4009d551c6279d7d9296660ef1ed09b36b4269697e0f0"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.808531 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmqdb" event={"ID":"7c367274-5d79-4c8e-a207-980d14875540","Type":"ContainerStarted","Data":"ac1ecde97feab9b3493861b45d08adf36b7f07906388ea54b156660ffdf14b54"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.812617 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" event={"ID":"f4991bf6-26f7-41af-b622-8ed185db7c6a","Type":"ContainerStarted","Data":"e0a6604777661dc3086c4ee4ae29889b4a63fa2a21718b19c1e7deafe318dd35"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.812644 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" 
event={"ID":"f4991bf6-26f7-41af-b622-8ed185db7c6a","Type":"ContainerStarted","Data":"eeeb48d5d62b49f775baace2838273b08939d6f4d5791126d38c90d61350c3a3"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.823154 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" event={"ID":"b9e1405b-f6b4-470e-bc75-bbccb1eaa8fe","Type":"ContainerStarted","Data":"788eb3d01cf5afe9b0c6c4df666016daccb793c45ae2320ecd039ee6ffd32e75"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.849643 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t2ksj" podStartSLOduration=18.84962126 podStartE2EDuration="18.84962126s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.837746917 +0000 UTC m=+39.283994380" watchObservedRunningTime="2025-06-13 04:51:20.84962126 +0000 UTC m=+39.295868723" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.851058 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bmj6"] Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.855680 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.855955 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.355942472 +0000 UTC m=+39.802189925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.898418 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" event={"ID":"cf11c387-9f91-4c1c-aaea-69f41a35d30c","Type":"ContainerStarted","Data":"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.899392 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.909376 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dpxmr" podStartSLOduration=18.909356003 podStartE2EDuration="18.909356003s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.894043401 +0000 UTC m=+39.340290864" watchObservedRunningTime="2025-06-13 04:51:20.909356003 +0000 UTC m=+39.355603466" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.912795 4894 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr7lv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.912843 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.928392 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" event={"ID":"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee","Type":"ContainerStarted","Data":"1d0973e469b4f471ace791d468dd1ed9eded50a5a6f6245e7dc3d8d1fddefd87"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.940997 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" event={"ID":"08893f80-1a41-48d7-a510-b2610bf60cae","Type":"ContainerStarted","Data":"e9de771a2ebb24b9dfe0c9f23cfa7c466e8efa495498c2b2934b502edcc48d4e"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.941040 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" event={"ID":"08893f80-1a41-48d7-a510-b2610bf60cae","Type":"ContainerStarted","Data":"4d8270bca3440fe227d593b24c982c2d478e22c91d254d2d7a3702eca8216f50"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.957289 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:20 crc kubenswrapper[4894]: E0613 04:51:20.958330 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.458308625 +0000 UTC m=+39.904556088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.970521 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" event={"ID":"e8208b57-8fd4-43b0-8a1f-a294cde0fcea","Type":"ContainerStarted","Data":"b616dba4de59ad8d7981cefcab8ec239085cca64a86305781651716d788b3b11"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.972217 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dbc7" podStartSLOduration=17.972208506 podStartE2EDuration="17.972208506s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:20.97096906 +0000 UTC m=+39.417216523" watchObservedRunningTime="2025-06-13 04:51:20.972208506 +0000 UTC m=+39.418455969" Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.978797 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" event={"ID":"d0e212b9-6f8a-424e-a015-3105a00fde55","Type":"ContainerStarted","Data":"005e8617e9aa9f28466c6a605b4a98f95bfabc8e6c2539288768f00087b92653"} Jun 13 04:51:20 crc kubenswrapper[4894]: I0613 04:51:20.994271 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh"] Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.002463 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" event={"ID":"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99","Type":"ContainerStarted","Data":"becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941"} Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.003281 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.007907 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-klddm" event={"ID":"d3324c84-cea5-4046-adfe-563caa78e068","Type":"ContainerStarted","Data":"701d3979af9a1d3fa650b3a4e142e53fc688ddfdefb80d2bb9ce516e845179f1"} Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.025288 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rwrdl" podStartSLOduration=18.025272717 podStartE2EDuration="18.025272717s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:21.024948927 +0000 UTC m=+39.471196400" watchObservedRunningTime="2025-06-13 04:51:21.025272717 +0000 UTC m=+39.471520170" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.038625 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7sz2j" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.058299 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.061547 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.561530363 +0000 UTC m=+40.007777826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.065523 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l"] Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.143197 4894 patch_prober.go:28] interesting pod/apiserver-76f77b778f-tzgh5 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]log ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]etcd ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/start-apiserver-admission-initializer ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/generic-apiserver-start-informers ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/max-in-flight-filter ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/storage-object-count-tracker-hook ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jun 13 04:51:21 crc kubenswrapper[4894]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jun 13 04:51:21 crc kubenswrapper[4894]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/project.openshift.io-projectcache ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/openshift.io-startinformers ok Jun 13 04:51:21 
crc kubenswrapper[4894]: [+]poststarthook/openshift.io-restmapperupdater ok Jun 13 04:51:21 crc kubenswrapper[4894]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jun 13 04:51:21 crc kubenswrapper[4894]: livez check failed Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.143253 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" podUID="ee7ab003-adaa-4207-8064-34ad105f5064" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.161085 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.162161 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.662141715 +0000 UTC m=+40.108389178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.189944 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podStartSLOduration=18.189923586 podStartE2EDuration="18.189923586s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:21.076255327 +0000 UTC m=+39.522502790" watchObservedRunningTime="2025-06-13 04:51:21.189923586 +0000 UTC m=+39.636171049" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.244536 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" podStartSLOduration=8.244514901 podStartE2EDuration="8.244514901s" podCreationTimestamp="2025-06-13 04:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:21.197208917 +0000 UTC m=+39.643456380" watchObservedRunningTime="2025-06-13 04:51:21.244514901 +0000 UTC m=+39.690762364" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.274525 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.274993 4894 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.77497847 +0000 UTC m=+40.221225933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.373069 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rpbsd" podStartSLOduration=19.373034879 podStartE2EDuration="19.373034879s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:21.338003848 +0000 UTC m=+39.784251311" watchObservedRunningTime="2025-06-13 04:51:21.373034879 +0000 UTC m=+39.819282342" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.376156 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.376560 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.87653758 +0000 UTC m=+40.322785043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: W0613 04:51:21.471684 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6422760_85db_41eb_a616_9eca3ca624cb.slice/crio-e9baeed85b9ea64c62860f7078a29ba994355c2be47d05401a5bb824c28083c6 WatchSource:0}: Error finding container e9baeed85b9ea64c62860f7078a29ba994355c2be47d05401a5bb824c28083c6: Status 404 returned error can't find the container with id e9baeed85b9ea64c62860f7078a29ba994355c2be47d05401a5bb824c28083c6 Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.478515 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.478851 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:21.978838321 +0000 UTC m=+40.425085784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.521740 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:21 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:21 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:21 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.522014 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.543213 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:21 crc kubenswrapper[4894]: W0613 04:51:21.580137 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ce4efec5fcc7c52979635424e91cafc024c199c17583b2276ce1d333bbcd3980 WatchSource:0}: Error finding container ce4efec5fcc7c52979635424e91cafc024c199c17583b2276ce1d333bbcd3980: Status 404 returned error can't find the container with id ce4efec5fcc7c52979635424e91cafc024c199c17583b2276ce1d333bbcd3980 Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.581160 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.581246 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.081220914 +0000 UTC m=+40.527468377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.581765 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.582174 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.082165702 +0000 UTC m=+40.528413165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.683163 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.683550 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.183531836 +0000 UTC m=+40.629779299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.697068 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4dj8k"] Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.784178 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.785345 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.285332642 +0000 UTC m=+40.731580105 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: W0613 04:51:21.786802 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc225ccc7_9659_4c3e_9256_af46e1dd1cd6.slice/crio-987c357c0e70256f76b4bad6f625fe43d41dba082505267a1b63e47c4b709780 WatchSource:0}: Error finding container 987c357c0e70256f76b4bad6f625fe43d41dba082505267a1b63e47c4b709780: Status 404 returned error can't find the container with id 987c357c0e70256f76b4bad6f625fe43d41dba082505267a1b63e47c4b709780 Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.887952 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.888225 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.38820702 +0000 UTC m=+40.834454483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.971114 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jun 13 04:51:21 crc kubenswrapper[4894]: I0613 04:51:21.989566 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:21 crc kubenswrapper[4894]: E0613 04:51:21.989903 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.489889923 +0000 UTC m=+40.936137386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.041554 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" event={"ID":"431e2331-d437-478a-aa43-c8b338746412","Type":"ContainerStarted","Data":"462b45682b33d1739e197d564ab4791ce9bec37a061635b17e59d724a6cc0854"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.064787 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4dj8k" event={"ID":"c225ccc7-9659-4c3e-9256-af46e1dd1cd6","Type":"ContainerStarted","Data":"987c357c0e70256f76b4bad6f625fe43d41dba082505267a1b63e47c4b709780"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.069908 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08413600755b27f3a79949a153182b6e7ba40797f6f272cc185a42595acc6208"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.081715 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-z8rxn"] Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.092678 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.093279 4894 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.592835863 +0000 UTC m=+41.039083326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.093336 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.093889 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.593880163 +0000 UTC m=+41.040127626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.094192 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgtfp" podStartSLOduration=20.094170371 podStartE2EDuration="20.094170371s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:22.079450367 +0000 UTC m=+40.525697830" watchObservedRunningTime="2025-06-13 04:51:22.094170371 +0000 UTC m=+40.540417824" Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.095094 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" event={"ID":"c484470f-893c-413d-a9bc-74641a4611ca","Type":"ContainerStarted","Data":"8f06ec7e721ff87b96949643fbfb46e16235da5b68e93db21e9a8a069bf36384"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.119843 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" event={"ID":"dade6111-6af8-4b41-bd58-ddfe5180eefd","Type":"ContainerStarted","Data":"cc933f511d0c5ba27aee31ca86d7cffbe9eab870248de29bd920ec34e00aa384"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.136504 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"64765e640850a9ec56514b1c2d3646dbfa6400730ca9971aa071b6819ba10a59"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.152985 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" event={"ID":"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f","Type":"ContainerStarted","Data":"e66e2a1edcff6c2c3a6fdaaa406d8c5aa6cd158a4e990d6b2e55e352e7d2b6fd"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.154268 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pllr7" event={"ID":"08284aa4-ae65-47a7-940e-9f558505402a","Type":"ContainerStarted","Data":"b06d98fc4a505197d17828bec2bccedf2e531edb3c46bcf395b3f25f419fafe7"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.156256 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" event={"ID":"e6422760-85db-41eb-a616-9eca3ca624cb","Type":"ContainerStarted","Data":"e9baeed85b9ea64c62860f7078a29ba994355c2be47d05401a5bb824c28083c6"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.171990 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t5g77" event={"ID":"f5204aa1-eb4d-4300-9e08-9403f16b8c3e","Type":"ContainerStarted","Data":"6a4810b6371dcda0b0b419dd9824b0ee7169c986814ba238ad0c2795cbcc0e8f"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.172034 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t5g77" event={"ID":"f5204aa1-eb4d-4300-9e08-9403f16b8c3e","Type":"ContainerStarted","Data":"bfa2280de93296521d04d543c340d41286ff898919bd341bfbb466976d3e98ff"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.188433 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ce4efec5fcc7c52979635424e91cafc024c199c17583b2276ce1d333bbcd3980"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.201885 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.202945 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.702925089 +0000 UTC m=+41.149172552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.206767 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t5g77" podStartSLOduration=8.206749119 podStartE2EDuration="8.206749119s" podCreationTimestamp="2025-06-13 04:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:22.205089841 +0000 UTC m=+40.651337304" watchObservedRunningTime="2025-06-13 04:51:22.206749119 +0000 UTC m=+40.652996582" Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.208853 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" event={"ID":"697c82f5-f7de-4836-b29b-a11d6277a00a","Type":"ContainerStarted","Data":"eeef4f06a55fe221d68a08226796b9d7a46dd5a7c9a20fc1276722fd38cdf2b9"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.248965 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" event={"ID":"2cc9c108-e3dd-445f-9f85-ae04a17c68ba","Type":"ContainerStarted","Data":"b089ba63a7d15672e9fa94fc1398c8afc0b06e3475ec5cba7a8fe5d3e8af0779"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.274840 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" event={"ID":"b63d933d-0902-413b-86d5-75c917cebade","Type":"ContainerStarted","Data":"97aec3bf2829d71fab8e63f5fc46947b2bc1da8f29392f7adc707bf8ad3ec4ec"} Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.303849 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.313585 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.813563579 +0000 UTC m=+41.259811042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.316264 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sg69n" podStartSLOduration=19.316229776 podStartE2EDuration="19.316229776s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:22.297554777 +0000 UTC m=+40.743802240" watchObservedRunningTime="2025-06-13 04:51:22.316229776 +0000 UTC m=+40.762477239" Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.318802 4894 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr7lv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.318874 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.408553 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.408879 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k2zcl" podStartSLOduration=20.408859788 podStartE2EDuration="20.408859788s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:22.340877457 +0000 UTC m=+40.787124920" watchObservedRunningTime="2025-06-13 04:51:22.408859788 +0000 UTC m=+40.855107251" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.410176 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:22.910155906 +0000 UTC m=+41.356403359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.512851 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.513278 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.01326198 +0000 UTC m=+41.459509443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.524168 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:22 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:22 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:22 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.524230 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.614195 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.614388 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.114354206 +0000 UTC m=+41.560601669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.628905 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.629502 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.129484243 +0000 UTC m=+41.575731706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.730613 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.731020 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.230999061 +0000 UTC m=+41.677246524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.731274 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.731601 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.231588208 +0000 UTC m=+41.677835671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.832843 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.833288 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.333254501 +0000 UTC m=+41.779501964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.833563 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.833945 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.333935851 +0000 UTC m=+41.780183314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.940109 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.940314 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.440278268 +0000 UTC m=+41.886525731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:22 crc kubenswrapper[4894]: I0613 04:51:22.940488 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:22 crc kubenswrapper[4894]: E0613 04:51:22.940859 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.440851435 +0000 UTC m=+41.887098898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.041441 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.041673 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.541624912 +0000 UTC m=+41.987872375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.042164 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.042580 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.542569059 +0000 UTC m=+41.988816512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.143702 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.143892 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.643857831 +0000 UTC m=+42.090105294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.143966 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.144360 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.644345775 +0000 UTC m=+42.090593238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.246787 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.265407 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.765366066 +0000 UTC m=+42.211613529 (durationBeforeRetry 500ms). 
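The failures in this stretch of the log all have the same cause: the kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI drivers, so every MountVolume/UnmountVolume attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is rejected and re-queued with the 500ms durationBeforeRetry shown above. A minimal sketch of one way to check, from outside the node, whether a driver has finished registering is to read the node's CSINode object, which lists the drivers whose node plugins have registered with the kubelet. This assumes client-go and a reachable kubeconfig; the driver and node names are taken from the log, everything else is illustrative rather than anything the log itself shows.

```go
// Sketch: list the CSI drivers registered on a node by reading its CSINode
// object. Assumes client-go and a kubeconfig; the node name "crc" and the
// driver name come from the log above, the rest is illustrative.
package main

import (
	"context"
	"fmt"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Until kubevirt.io.hostpath-provisioner appears in this list, volume
	// operations that need it keep failing exactly as in the log.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Printf("registered driver: %s (node ID %s)\n", d.Name, d.NodeID)
	}
}
```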
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.296241 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" event={"ID":"c484470f-893c-413d-a9bc-74641a4611ca","Type":"ContainerStarted","Data":"ef0dc9f6f46563f0ee5111494b76541ccc337d9316ee1521669602fe4b336b14"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.297674 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.299376 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" event={"ID":"dade6111-6af8-4b41-bd58-ddfe5180eefd","Type":"ContainerStarted","Data":"2a4b24ea6449f7552b4e378b3922af17aff24d114b4596d75aa5928d4ec306a8"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.299455 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.303761 4894 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-c629l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.303795 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" podUID="dade6111-6af8-4b41-bd58-ddfe5180eefd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.309200 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" event={"ID":"e6422760-85db-41eb-a616-9eca3ca624cb","Type":"ContainerStarted","Data":"29e455b94219369402c2d4dd59f4706723dcb6e924462ca20ecd35238777956f"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.342236 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" event={"ID":"d0e212b9-6f8a-424e-a015-3105a00fde55","Type":"ContainerStarted","Data":"2027672f7ba0d0ad4ca445e28e557e691538abe61fea87468e24e7fed5777b98"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.343532 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"da4b008f53a15efc82247a6c2c2689e680e67895436b6adb381531a433951a20"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.345714 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"94b506e5f6ee7871b3442c9103cc5f2038773cdad29724369e9d7eecf2f77a67"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.346150 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.348949 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.349258 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.849246766 +0000 UTC m=+42.295494229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.357360 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pllr7" event={"ID":"08284aa4-ae65-47a7-940e-9f558505402a","Type":"ContainerStarted","Data":"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.359808 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-klddm" event={"ID":"d3324c84-cea5-4046-adfe-563caa78e068","Type":"ContainerStarted","Data":"99b852518cf42aef71cd12b4611d7d664acc93bbfb4370ae431af951ad31124a"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.359847 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-klddm" event={"ID":"d3324c84-cea5-4046-adfe-563caa78e068","Type":"ContainerStarted","Data":"03582b920ddf6f4819c5132a97a774880c266f99c9432c59b6348ad853c361f2"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.360041 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-klddm" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.366472 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4dj8k" event={"ID":"c225ccc7-9659-4c3e-9256-af46e1dd1cd6","Type":"ContainerStarted","Data":"72b225fd9bb752ef6c0919809f06a3652a703259646fc4ef7898d5577d2d9427"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.378005 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" event={"ID":"697c82f5-f7de-4836-b29b-a11d6277a00a","Type":"ContainerStarted","Data":"f40dd291527653e357b5706a449a31ed024f4713bae430b55444ef456e338e7d"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.379055 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.381155 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" podStartSLOduration=20.381143876 podStartE2EDuration="20.381143876s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.320753564 +0000 UTC m=+41.767001027" watchObservedRunningTime="2025-06-13 04:51:23.381143876 +0000 UTC m=+41.827391339" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.399115 4894 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-w4dzh container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.399158 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" podUID="697c82f5-f7de-4836-b29b-a11d6277a00a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.418848 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" podStartSLOduration=20.418827553 podStartE2EDuration="20.418827553s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.385052029 +0000 UTC m=+41.831299492" watchObservedRunningTime="2025-06-13 04:51:23.418827553 +0000 UTC m=+41.865075016" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.419975 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" event={"ID":"9de8fcfc-9d3f-4aeb-be3d-089fe4e78d2f","Type":"ContainerStarted","Data":"0fea50332ed09467f37aca0e6e61be6977a5314823b22ef596f6fbb71cd56a46"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.421574 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6bmj6" podStartSLOduration=20.421567742 podStartE2EDuration="20.421567742s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.414191859 +0000 UTC m=+41.860439322" watchObservedRunningTime="2025-06-13 04:51:23.421567742 +0000 UTC m=+41.867815205" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.436632 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7b3cda0d58993f7e9c1a9e8f24abb157dfdc298536f2ccfd07dc5d841078fc2d"} Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.437514 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" 
containerName="kube-multus-additional-cni-plugins" containerID="cri-o://becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" gracePeriod=30 Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.439349 4894 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr7lv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.439386 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.446631 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-klddm" podStartSLOduration=9.446617035 podStartE2EDuration="9.446617035s" podCreationTimestamp="2025-06-13 04:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.4436841 +0000 UTC m=+41.889931563" watchObservedRunningTime="2025-06-13 04:51:23.446617035 +0000 UTC m=+41.892864498" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.450421 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.451906 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:23.951890547 +0000 UTC m=+42.398138000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.496943 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" podStartSLOduration=20.496908735 podStartE2EDuration="20.496908735s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.466937091 +0000 UTC m=+41.913184564" watchObservedRunningTime="2025-06-13 04:51:23.496908735 +0000 UTC m=+41.943156198" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.500335 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pllr7" podStartSLOduration=21.500317154 podStartE2EDuration="21.500317154s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.492856038 +0000 UTC m=+41.939103501" watchObservedRunningTime="2025-06-13 04:51:23.500317154 +0000 UTC m=+41.946564617" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.521585 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:23 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:23 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:23 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.521699 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.552034 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.552573 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.05253868 +0000 UTC m=+42.498786143 (durationBeforeRetry 500ms). 
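Interleaved with the volume retries are probe failures of two shapes: readiness probes that cannot connect at all ("dial tcp 10.217.0.19:8080: connect: connection refused" for marketplace-operator, and the same pattern for catalog-operator and olm-operator on 8443), and the router's startup probe, which does connect but gets a 500 whose body lists the failing sub-checks ([-]backend-http, [-]has-synced). The sketch below shows the general shape of such an HTTP probe: GET with a short timeout, connection errors and non-2xx/3xx statuses count as failures, and the start of the response body is kept for the failure message. It is a generic illustration, not the kubelet's prober; the URL is the one reported in the log.

```go
// Sketch: the shape of an HTTP readiness/startup probe like the ones failing
// in the log above. Generic illustration only; the URL comes from the log.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 10.217.0.19:8080: connect: connection refused"
		return fmt.Errorf("probe failed: %w", err)
	}
	defer resp.Body.Close()

	// Keep the start of the body for the failure message, as the log's
	// "start-of-body=" field does.
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 512))
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body: %s",
			resp.StatusCode, body)
	}
	return nil
}

func main() {
	if err := probe("http://10.217.0.19:8080/healthz"); err != nil {
		fmt.Println(err)
	}
}
```

These failures are transient in the log: the containers have only just started (the ContainerStarted events a few lines earlier), and later entries show the same pods reporting probe status "ready".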
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.654290 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.654693 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.154674196 +0000 UTC m=+42.600921659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.699031 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwzgq" podStartSLOduration=20.699004625 podStartE2EDuration="20.699004625s" podCreationTimestamp="2025-06-13 04:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:23.693241689 +0000 UTC m=+42.139489152" watchObservedRunningTime="2025-06-13 04:51:23.699004625 +0000 UTC m=+42.145252078" Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.755618 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.756043 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.25602549 +0000 UTC m=+42.702272943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.856680 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.856895 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.356863169 +0000 UTC m=+42.803110632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.857542 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.857967 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.35794957 +0000 UTC m=+42.804197033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.958385 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.958644 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.458605964 +0000 UTC m=+42.904853427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:23 crc kubenswrapper[4894]: I0613 04:51:23.958748 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:23 crc kubenswrapper[4894]: E0613 04:51:23.959052 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.459036486 +0000 UTC m=+42.905283959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.060581 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.061021 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.561001178 +0000 UTC m=+43.007248641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.162354 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.163365 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.66334773 +0000 UTC m=+43.109595193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.264327 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.264731 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.764713374 +0000 UTC m=+43.210960837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.365530 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.366006 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.865991086 +0000 UTC m=+43.312238539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.409559 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.415463 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tzgh5" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.445208 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" event={"ID":"d0e212b9-6f8a-424e-a015-3105a00fde55","Type":"ContainerStarted","Data":"f960f7d357c505a757f24d4132d1e457ee04562ada9aa987befc55d6f117b4bd"} Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.447614 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" event={"ID":"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee","Type":"ContainerStarted","Data":"e78057cbc96550ca31c699cb4575e82a8e4df9fb8a0267de4617b4c791426dff"} Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.447676 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" event={"ID":"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee","Type":"ContainerStarted","Data":"21a04a2c7e84a67bf04518aa3231afcfa650c5a42940959229155d914fb33401"} Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.451954 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4dj8k" event={"ID":"c225ccc7-9659-4c3e-9256-af46e1dd1cd6","Type":"ContainerStarted","Data":"bc809f2b2d811cbba7dd1c459755eeb33e419a26aec0b59f37b22c0759bff0b9"} Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.466406 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.466835 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:24.966818044 +0000 UTC m=+43.413065507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.483140 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c629l" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.519582 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-w4dzh" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.526196 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:24 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:24 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:24 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.526273 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.529698 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4dj8k" podStartSLOduration=22.529641677 podStartE2EDuration="22.529641677s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:24.526916458 +0000 UTC m=+42.973163921" watchObservedRunningTime="2025-06-13 04:51:24.529641677 +0000 UTC m=+42.975889140" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.568892 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.571429 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.071408472 +0000 UTC m=+43.517655935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.640615 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbbt2" podStartSLOduration=22.640596638 podStartE2EDuration="22.640596638s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:24.605602078 +0000 UTC m=+43.051849541" watchObservedRunningTime="2025-06-13 04:51:24.640596638 +0000 UTC m=+43.086844091" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.669912 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.670072 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.170042847 +0000 UTC m=+43.616290310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.670593 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.670952 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.170938333 +0000 UTC m=+43.617185796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.771453 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.771682 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.271619767 +0000 UTC m=+43.717867230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.771762 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.772121 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.272110501 +0000 UTC m=+43.718357964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.872567 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.872795 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.372764435 +0000 UTC m=+43.819011898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.872927 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.873223 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.373210608 +0000 UTC m=+43.819458061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.961098 4894 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.973780 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.973959 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.473930503 +0000 UTC m=+43.920177966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:24 crc kubenswrapper[4894]: I0613 04:51:24.974064 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:24 crc kubenswrapper[4894]: E0613 04:51:24.974384 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.474376836 +0000 UTC m=+43.920624299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.075080 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:25 crc kubenswrapper[4894]: E0613 04:51:25.075494 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.575475202 +0000 UTC m=+44.021722665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.176615 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: E0613 04:51:25.177016 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.676998551 +0000 UTC m=+44.123246004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.277563 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:25 crc kubenswrapper[4894]: E0613 04:51:25.277791 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.777762988 +0000 UTC m=+44.224010441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.379511 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: E0613 04:51:25.379931 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-06-13 04:51:25.879919335 +0000 UTC m=+44.326166798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggqkw" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.381719 4894 patch_prober.go:28] interesting pod/downloads-7954f5f757-fljtp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.381757 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fljtp" podUID="98a99c86-d227-48f6-be6f-b0cddd0221ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.381955 4894 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-06-13T04:51:24.961124704Z","Handler":null,"Name":""} Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.382867 4894 patch_prober.go:28] interesting pod/downloads-7954f5f757-fljtp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.382888 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fljtp" podUID="98a99c86-d227-48f6-be6f-b0cddd0221ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.393107 4894 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.393141 4894 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.457283 4894 generic.go:334] "Generic (PLEG): container finished" podID="58e188bf-fe34-404b-8bb3-ec0ca09e013d" containerID="ca4a81ad914b49013193ba58ff55bdd4f7d7cea44844f8e6d9abef1c28f93a8e" exitCode=0 Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.457344 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" event={"ID":"58e188bf-fe34-404b-8bb3-ec0ca09e013d","Type":"ContainerDied","Data":"ca4a81ad914b49013193ba58ff55bdd4f7d7cea44844f8e6d9abef1c28f93a8e"} Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.460337 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" 
event={"ID":"7dbc6fdc-ec5e-4aec-bfb8-8059b8382cee","Type":"ContainerStarted","Data":"27dfbe85772f830319dd1429ce2b7d1a05d8028f1442297cb600faf4bc6bbd7f"} Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.484404 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.505575 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ns4bb" podStartSLOduration=11.505560439 podStartE2EDuration="11.505560439s" podCreationTimestamp="2025-06-13 04:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:25.504320563 +0000 UTC m=+43.950568016" watchObservedRunningTime="2025-06-13 04:51:25.505560439 +0000 UTC m=+43.951807902" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.521935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.524240 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:25 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:25 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:25 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.524320 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.581276 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.582523 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.585586 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.586065 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.592094 4894 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.592148 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.595750 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.642631 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggqkw\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.687926 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.688060 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.688225 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27trx\" (UniqueName: \"kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.782043 4894 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.783216 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.785378 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.791054 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.791142 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.791207 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27trx\" (UniqueName: \"kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.791829 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.791854 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.801775 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.832898 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.837074 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27trx\" (UniqueName: \"kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx\") pod \"certified-operators-47h94\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.883026 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.883779 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.885803 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.886939 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.892946 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.892995 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.893051 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.901076 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:51:25 crc kubenswrapper[4894]: I0613 04:51:25.912230 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.993690 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.993732 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.993759 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.993783 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.993836 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.994245 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:25.997363 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.002437 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.003344 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.013488 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h\") pod \"community-operators-rd5qz\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.024915 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.059189 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.059324 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.095964 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.096012 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42hl\" (UniqueName: \"kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.096046 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.096066 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.096122 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.096479 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.101251 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.105846 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.125497 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.218437 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.218515 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42hl\" (UniqueName: \"kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.218574 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.219166 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.196886 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.221262 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.221386 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.222760 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.225641 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.255497 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42hl\" (UniqueName: \"kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl\") pod \"certified-operators-vqcmw\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.289419 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.314164 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.319319 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r47px\" (UniqueName: \"kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.319395 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.319430 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: W0613 04:51:26.321375 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7074322_a56f_4380_bf71_2ae9d44e9bc8.slice/crio-a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1 WatchSource:0}: Error finding container a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1: Status 404 returned error can't find the 
container with id a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1 Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.328588 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.420274 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.420614 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.420771 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r47px\" (UniqueName: \"kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.421756 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.422391 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.428626 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.461546 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r47px\" (UniqueName: \"kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px\") pod \"community-operators-nqxnn\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.472700 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerStarted","Data":"004e2cf2b94724be5023d6e4932361242e660074510338af261e2b960f274182"} Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.473801 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" event={"ID":"e7074322-a56f-4380-bf71-2ae9d44e9bc8","Type":"ContainerStarted","Data":"a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1"} Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.504979 4894 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.519326 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.521883 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:26 crc kubenswrapper[4894]: [-]has-synced failed: reason withheld Jun 13 04:51:26 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:26 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.521936 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.547789 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:51:26 crc kubenswrapper[4894]: W0613 04:51:26.565205 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88e73a7_e3d0_4015_9f77_748ea17f6e39.slice/crio-93ded20b072bcfc0207a34af38b9c8e54a3e073dbbf84d9790e2baa22b612315 WatchSource:0}: Error finding container 93ded20b072bcfc0207a34af38b9c8e54a3e073dbbf84d9790e2baa22b612315: Status 404 returned error can't find the container with id 93ded20b072bcfc0207a34af38b9c8e54a3e073dbbf84d9790e2baa22b612315 Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.575264 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.661562 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:51:26 crc kubenswrapper[4894]: W0613 04:51:26.689401 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd0e39e_cf29_4fa4_9778_84bb3d255d82.slice/crio-2e1a6462f6b663cb0711276d627d0e2d329cfb7a84c4cf3842f58d375ba8e850 WatchSource:0}: Error finding container 2e1a6462f6b663cb0711276d627d0e2d329cfb7a84c4cf3842f58d375ba8e850: Status 404 returned error can't find the container with id 2e1a6462f6b663cb0711276d627d0e2d329cfb7a84c4cf3842f58d375ba8e850 Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.833198 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:26 crc kubenswrapper[4894]: I0613 04:51:26.970219 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:51:26 crc kubenswrapper[4894]: W0613 04:51:26.979738 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6291b0c_010c_4cf4_8df4_9007de0e4f35.slice/crio-394e818ce3836ddf7b1432dfbcb02052176c94bc13e4bf44db42ef1a17ab3553 WatchSource:0}: Error finding container 394e818ce3836ddf7b1432dfbcb02052176c94bc13e4bf44db42ef1a17ab3553: Status 404 returned error can't find the container with id 394e818ce3836ddf7b1432dfbcb02052176c94bc13e4bf44db42ef1a17ab3553 Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.033146 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume\") pod \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.033209 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume\") pod \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.033287 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcvxl\" (UniqueName: \"kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl\") pod \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\" (UID: \"58e188bf-fe34-404b-8bb3-ec0ca09e013d\") " Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.034189 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume" (OuterVolumeSpecName: "config-volume") pod "58e188bf-fe34-404b-8bb3-ec0ca09e013d" (UID: "58e188bf-fe34-404b-8bb3-ec0ca09e013d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.038754 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl" (OuterVolumeSpecName: "kube-api-access-zcvxl") pod "58e188bf-fe34-404b-8bb3-ec0ca09e013d" (UID: "58e188bf-fe34-404b-8bb3-ec0ca09e013d"). InnerVolumeSpecName "kube-api-access-zcvxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.038986 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "58e188bf-fe34-404b-8bb3-ec0ca09e013d" (UID: "58e188bf-fe34-404b-8bb3-ec0ca09e013d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.135283 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58e188bf-fe34-404b-8bb3-ec0ca09e013d-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.135322 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58e188bf-fe34-404b-8bb3-ec0ca09e013d-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.135332 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcvxl\" (UniqueName: \"kubernetes.io/projected/58e188bf-fe34-404b-8bb3-ec0ca09e013d-kube-api-access-zcvxl\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.174440 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:51:27 crc kubenswrapper[4894]: E0613 04:51:27.302533 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:27 crc kubenswrapper[4894]: E0613 04:51:27.304736 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:27 crc kubenswrapper[4894]: E0613 04:51:27.306443 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:27 crc kubenswrapper[4894]: E0613 04:51:27.306485 4894 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.495448 4894 generic.go:334] "Generic (PLEG): container finished" podID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerID="476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79" exitCode=0 Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.495574 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerDied","Data":"476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.495606 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerStarted","Data":"394e818ce3836ddf7b1432dfbcb02052176c94bc13e4bf44db42ef1a17ab3553"} Jun 13 04:51:27 crc 
kubenswrapper[4894]: I0613 04:51:27.500025 4894 generic.go:334] "Generic (PLEG): container finished" podID="78966571-5be7-4b00-a993-39b3158ee935" containerID="6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92" exitCode=0 Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.500098 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerDied","Data":"6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.502253 4894 generic.go:334] "Generic (PLEG): container finished" podID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerID="5bb4bef9d030e5c1c3cd6b528d931e48be19a628f67735e497666ef1eb6f0571" exitCode=0 Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.502545 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerDied","Data":"5bb4bef9d030e5c1c3cd6b528d931e48be19a628f67735e497666ef1eb6f0571"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.502626 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerStarted","Data":"93ded20b072bcfc0207a34af38b9c8e54a3e073dbbf84d9790e2baa22b612315"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.504200 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerID="61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f" exitCode=0 Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.504290 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerDied","Data":"61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.504331 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerStarted","Data":"2e1a6462f6b663cb0711276d627d0e2d329cfb7a84c4cf3842f58d375ba8e850"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.504923 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.509387 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.509419 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.511431 4894 patch_prober.go:28] interesting pod/console-f9d7485db-pllr7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.511471 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pllr7" podUID="08284aa4-ae65-47a7-940e-9f558505402a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: 
connect: connection refused" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.511861 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ebc74e0-474a-4a41-9066-8883d93711f3","Type":"ContainerStarted","Data":"a002ba5044afaa89f406144f483051be6b555ff70ebe34a3f5db58da6e5611ca"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.511885 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ebc74e0-474a-4a41-9066-8883d93711f3","Type":"ContainerStarted","Data":"96bfadbe761ea2049697a26d20c072201d318e9ef007c8476042b340b1b096b8"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.524231 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" event={"ID":"e7074322-a56f-4380-bf71-2ae9d44e9bc8","Type":"ContainerStarted","Data":"9586a0b1f3d3ab149f69997076b6fa5eff18460f30c2d608923e600903e396b9"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.524751 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.529027 4894 patch_prober.go:28] interesting pod/router-default-5444994796-5tgfk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jun 13 04:51:27 crc kubenswrapper[4894]: [+]has-synced ok Jun 13 04:51:27 crc kubenswrapper[4894]: [+]process-running ok Jun 13 04:51:27 crc kubenswrapper[4894]: healthz check failed Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.529107 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5tgfk" podUID="c4bd4fd2-c14d-42f7-819c-84bb722484d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.530860 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" event={"ID":"58e188bf-fe34-404b-8bb3-ec0ca09e013d","Type":"ContainerDied","Data":"f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da"} Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.530953 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0f7f2fc80d8ec4adbb0a2975c361bbae5820fb690cfe00209b595a32acc95da" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.531153 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.608458 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.6084271 podStartE2EDuration="2.6084271s" podCreationTimestamp="2025-06-13 04:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:27.58588696 +0000 UTC m=+46.032134423" watchObservedRunningTime="2025-06-13 04:51:27.6084271 +0000 UTC m=+46.054674573" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.614097 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" podStartSLOduration=25.614067313 podStartE2EDuration="25.614067313s" podCreationTimestamp="2025-06-13 04:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:27.611638733 +0000 UTC m=+46.057886216" watchObservedRunningTime="2025-06-13 04:51:27.614067313 +0000 UTC m=+46.060314776" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.776781 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:51:27 crc kubenswrapper[4894]: E0613 04:51:27.777075 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e188bf-fe34-404b-8bb3-ec0ca09e013d" containerName="collect-profiles" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.777098 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e188bf-fe34-404b-8bb3-ec0ca09e013d" containerName="collect-profiles" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.777209 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e188bf-fe34-404b-8bb3-ec0ca09e013d" containerName="collect-profiles" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.778117 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.793188 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.807982 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.871470 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.872608 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.878024 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.878295 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.900290 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.958748 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.958800 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.958827 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.958886 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q87r\" (UniqueName: \"kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:27 crc kubenswrapper[4894]: I0613 04:51:27.959098 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061260 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061327 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061371 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061446 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q87r\" (UniqueName: \"kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061466 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.061986 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.062507 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.062823 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.079443 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.094970 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q87r\" (UniqueName: \"kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r\") pod \"redhat-marketplace-r6hfb\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.103013 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.182798 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.184000 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.193108 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.199936 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.365236 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.365683 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.365758 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5hx\" (UniqueName: \"kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.433627 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.470383 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.470457 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.470505 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5hx\" (UniqueName: \"kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.471306 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.471582 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.505787 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5hx\" (UniqueName: \"kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx\") pod \"redhat-marketplace-xd5m5\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.521035 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.524863 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5tgfk" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.587863 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.588132 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4efc2ace-7464-4a68-a22a-046404032bd1","Type":"ContainerStarted","Data":"536bf24e08dbb5ef9f070a73fcf66debd240721e560805042f6a61a1107837be"} Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.590411 4894 generic.go:334] "Generic (PLEG): container finished" podID="0ebc74e0-474a-4a41-9066-8883d93711f3" containerID="a002ba5044afaa89f406144f483051be6b555ff70ebe34a3f5db58da6e5611ca" exitCode=0 Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.590668 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ebc74e0-474a-4a41-9066-8883d93711f3","Type":"ContainerDied","Data":"a002ba5044afaa89f406144f483051be6b555ff70ebe34a3f5db58da6e5611ca"} Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.792462 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.793773 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.794678 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.795882 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.796964 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.877545 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d65dc\" (UniqueName: \"kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.877630 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.877679 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.979234 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d65dc\" (UniqueName: \"kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.979802 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.979828 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.980639 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:28 crc kubenswrapper[4894]: I0613 04:51:28.980751 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.012837 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-d65dc\" (UniqueName: \"kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc\") pod \"redhat-operators-r8bkp\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.129399 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.159322 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.182010 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.185298 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.205588 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.283577 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.283640 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cpc\" (UniqueName: \"kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.283704 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.386519 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.387275 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2cpc\" (UniqueName: \"kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.387439 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " 
pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.388328 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.389177 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.418741 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cpc\" (UniqueName: \"kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc\") pod \"redhat-operators-h8tmc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.475331 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.516692 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.622513 4894 generic.go:334] "Generic (PLEG): container finished" podID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerID="2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319" exitCode=0 Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.622574 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerDied","Data":"2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.622712 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerStarted","Data":"3a28ca645993d4fd867678ac809c92e28eba1ebd4a2933bdba4c686eeaff738a"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.636356 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerStarted","Data":"cf1f540c955b17d1dcd291376c2ef56eb7c4934b809d117ab14250108b3b0f4b"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.661150 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4efc2ace-7464-4a68-a22a-046404032bd1","Type":"ContainerStarted","Data":"f682608f33b50c800928ce17da48948d82fb7833c86bf8189107076595269eda"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.671238 4894 generic.go:334] "Generic (PLEG): container finished" podID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerID="ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb" exitCode=0 Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.671818 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" 
event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerDied","Data":"ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.671844 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerStarted","Data":"643ef98bd0d7da569f989a4b2c4f3177eae9b99dc291533ea36b6861d1da385f"} Jun 13 04:51:29 crc kubenswrapper[4894]: I0613 04:51:29.779298 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.7792778719999998 podStartE2EDuration="2.779277872s" podCreationTimestamp="2025-06-13 04:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:29.678043941 +0000 UTC m=+48.124291404" watchObservedRunningTime="2025-06-13 04:51:29.779277872 +0000 UTC m=+48.225525335" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.083208 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.093170 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.213201 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access\") pod \"0ebc74e0-474a-4a41-9066-8883d93711f3\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.213565 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir\") pod \"0ebc74e0-474a-4a41-9066-8883d93711f3\" (UID: \"0ebc74e0-474a-4a41-9066-8883d93711f3\") " Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.213780 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0ebc74e0-474a-4a41-9066-8883d93711f3" (UID: "0ebc74e0-474a-4a41-9066-8883d93711f3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.214103 4894 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebc74e0-474a-4a41-9066-8883d93711f3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.237300 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0ebc74e0-474a-4a41-9066-8883d93711f3" (UID: "0ebc74e0-474a-4a41-9066-8883d93711f3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.316057 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ebc74e0-474a-4a41-9066-8883d93711f3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.696581 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.698151 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ebc74e0-474a-4a41-9066-8883d93711f3","Type":"ContainerDied","Data":"96bfadbe761ea2049697a26d20c072201d318e9ef007c8476042b340b1b096b8"} Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.698239 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96bfadbe761ea2049697a26d20c072201d318e9ef007c8476042b340b1b096b8" Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.704579 4894 generic.go:334] "Generic (PLEG): container finished" podID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerID="a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0" exitCode=0 Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.704647 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerDied","Data":"a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0"} Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.704731 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerStarted","Data":"808fb35e58aa0f06019e9c1feafe55d81ceb5ef8d39cbdf5d68af2e07029ec95"} Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.717413 4894 generic.go:334] "Generic (PLEG): container finished" podID="318542e4-4246-4790-b652-4a173323ec4f" containerID="37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6" exitCode=0 Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.717508 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerDied","Data":"37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6"} Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.723928 4894 generic.go:334] "Generic (PLEG): container finished" podID="4efc2ace-7464-4a68-a22a-046404032bd1" containerID="f682608f33b50c800928ce17da48948d82fb7833c86bf8189107076595269eda" exitCode=0 Jun 13 04:51:30 crc kubenswrapper[4894]: I0613 04:51:30.723981 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4efc2ace-7464-4a68-a22a-046404032bd1","Type":"ContainerDied","Data":"f682608f33b50c800928ce17da48948d82fb7833c86bf8189107076595269eda"} Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.237493 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.304132 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-klddm" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.375422 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir\") pod \"4efc2ace-7464-4a68-a22a-046404032bd1\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.375487 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access\") pod \"4efc2ace-7464-4a68-a22a-046404032bd1\" (UID: \"4efc2ace-7464-4a68-a22a-046404032bd1\") " Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.375609 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4efc2ace-7464-4a68-a22a-046404032bd1" (UID: "4efc2ace-7464-4a68-a22a-046404032bd1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.375958 4894 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efc2ace-7464-4a68-a22a-046404032bd1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.394720 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4efc2ace-7464-4a68-a22a-046404032bd1" (UID: "4efc2ace-7464-4a68-a22a-046404032bd1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.477181 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efc2ace-7464-4a68-a22a-046404032bd1-kube-api-access\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.770878 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4efc2ace-7464-4a68-a22a-046404032bd1","Type":"ContainerDied","Data":"536bf24e08dbb5ef9f070a73fcf66debd240721e560805042f6a61a1107837be"} Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.771172 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="536bf24e08dbb5ef9f070a73fcf66debd240721e560805042f6a61a1107837be" Jun 13 04:51:32 crc kubenswrapper[4894]: I0613 04:51:32.771013 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jun 13 04:51:35 crc kubenswrapper[4894]: I0613 04:51:35.387037 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fljtp" Jun 13 04:51:37 crc kubenswrapper[4894]: I0613 04:51:37.222476 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jun 13 04:51:37 crc kubenswrapper[4894]: I0613 04:51:37.237553 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jun 13 04:51:37 crc kubenswrapper[4894]: E0613 04:51:37.309209 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:37 crc kubenswrapper[4894]: E0613 04:51:37.314554 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:37 crc kubenswrapper[4894]: E0613 04:51:37.317639 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:37 crc kubenswrapper[4894]: E0613 04:51:37.317699 4894 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:51:37 crc kubenswrapper[4894]: I0613 04:51:37.513301 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:37 crc kubenswrapper[4894]: I0613 04:51:37.516688 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 04:51:37 crc kubenswrapper[4894]: I0613 04:51:37.538538 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.538517112 podStartE2EDuration="538.517112ms" podCreationTimestamp="2025-06-13 04:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:51:37.529343017 +0000 UTC m=+55.975590490" watchObservedRunningTime="2025-06-13 04:51:37.538517112 +0000 UTC m=+55.984764575" Jun 13 04:51:45 crc kubenswrapper[4894]: I0613 04:51:45.837934 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:51:47 crc kubenswrapper[4894]: E0613 04:51:47.301254 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:47 crc kubenswrapper[4894]: E0613 04:51:47.303982 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:47 crc kubenswrapper[4894]: E0613 04:51:47.305368 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" cmd=["/bin/bash","-c","test -f /ready/ready"] Jun 13 04:51:47 crc kubenswrapper[4894]: E0613 04:51:47.305404 4894 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:51:53 crc kubenswrapper[4894]: E0613 04:51:53.198411 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jun 13 04:51:53 crc kubenswrapper[4894]: E0613 04:51:53.199278 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27trx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-47h94_openshift-marketplace(78966571-5be7-4b00-a993-39b3158ee935): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Jun 13 04:51:53 crc kubenswrapper[4894]: E0613 04:51:53.200843 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-47h94" podUID="78966571-5be7-4b00-a993-39b3158ee935" Jun 13 04:51:53 crc kubenswrapper[4894]: I0613 04:51:53.915681 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-z8rxn_a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99/kube-multus-additional-cni-plugins/0.log" Jun 13 04:51:53 crc kubenswrapper[4894]: I0613 04:51:53.915738 4894 generic.go:334] "Generic (PLEG): container finished" podID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" exitCode=137 Jun 13 04:51:53 crc kubenswrapper[4894]: I0613 04:51:53.915829 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" event={"ID":"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99","Type":"ContainerDied","Data":"becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941"} Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.283781 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-47h94" podUID="78966571-5be7-4b00-a993-39b3158ee935" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.369413 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-z8rxn_a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99/kube-multus-additional-cni-plugins/0.log" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.369681 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.377347 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.377505 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mx5hx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xd5m5_openshift-marketplace(d20f42b3-f716-4c1e-8720-bf1f6e43fc97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.378799 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xd5m5" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.402021 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.402155 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8q87r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-r6hfb_openshift-marketplace(0cbd7e4c-94e5-49ed-9c93-674f70506056): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.403809 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-r6hfb" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.534816 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtjk8\" (UniqueName: \"kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8\") pod \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.535233 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready\") pod \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.535321 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir\") pod \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.535358 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist\") pod \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\" (UID: \"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99\") " Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.535528 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" (UID: "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.535882 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready" (OuterVolumeSpecName: "ready") pod "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" (UID: "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.536062 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" (UID: "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.540098 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8" (OuterVolumeSpecName: "kube-api-access-wtjk8") pod "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" (UID: "a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99"). InnerVolumeSpecName "kube-api-access-wtjk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.637241 4894 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.637286 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtjk8\" (UniqueName: \"kubernetes.io/projected/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-kube-api-access-wtjk8\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.637297 4894 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-ready\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.637308 4894 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.923419 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerStarted","Data":"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de"} Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.927007 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-z8rxn_a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99/kube-multus-additional-cni-plugins/0.log" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.927168 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.927193 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-z8rxn" event={"ID":"a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99","Type":"ContainerDied","Data":"8316257f601297d5adfbad3f37d13b2cbffd43b84f1ee54cad73cd03832e7993"} Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.927302 4894 scope.go:117] "RemoveContainer" containerID="becad6c006c8334687ddc14b7a550c39e870e4be3db8f73762d5ae7474c76941" Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.934048 4894 generic.go:334] "Generic (PLEG): container finished" podID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerID="d850497847391f8a8018f0089dcae8781d46543ea3e4d2b5f91526ce76a856cf" exitCode=0 Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.934147 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerDied","Data":"d850497847391f8a8018f0089dcae8781d46543ea3e4d2b5f91526ce76a856cf"} Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.940337 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerStarted","Data":"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae"} Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.949253 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerID="ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556" exitCode=0 Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.949329 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerDied","Data":"ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556"} Jun 13 04:51:54 crc kubenswrapper[4894]: I0613 04:51:54.956929 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerStarted","Data":"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d"} Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.958777 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-r6hfb" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" Jun 13 04:51:54 crc kubenswrapper[4894]: E0613 04:51:54.959133 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xd5m5" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.077893 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-z8rxn"] Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.080296 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-z8rxn"] Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 
04:51:55.964768 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerStarted","Data":"cc2282362d218f6ad76c98b18adb461fccf1c34545e3a2ecdc1e27c4c4bc06e1"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.966868 4894 generic.go:334] "Generic (PLEG): container finished" podID="318542e4-4246-4790-b652-4a173323ec4f" containerID="ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae" exitCode=0 Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.966932 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerDied","Data":"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.969047 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerStarted","Data":"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.975432 4894 generic.go:334] "Generic (PLEG): container finished" podID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerID="853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d" exitCode=0 Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.975496 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerDied","Data":"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.975582 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerStarted","Data":"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.977085 4894 generic.go:334] "Generic (PLEG): container finished" podID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerID="43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de" exitCode=0 Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.977122 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerDied","Data":"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de"} Jun 13 04:51:55 crc kubenswrapper[4894]: I0613 04:51:55.988639 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rd5qz" podStartSLOduration=3.0952826 podStartE2EDuration="30.988619494s" podCreationTimestamp="2025-06-13 04:51:25 +0000 UTC" firstStartedPulling="2025-06-13 04:51:27.504251065 +0000 UTC m=+45.950498568" lastFinishedPulling="2025-06-13 04:51:55.397587999 +0000 UTC m=+73.843835462" observedRunningTime="2025-06-13 04:51:55.984647888 +0000 UTC m=+74.430895351" watchObservedRunningTime="2025-06-13 04:51:55.988619494 +0000 UTC m=+74.434866967" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.011149 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h8tmc" podStartSLOduration=2.238763942 podStartE2EDuration="27.011128381s" 
podCreationTimestamp="2025-06-13 04:51:29 +0000 UTC" firstStartedPulling="2025-06-13 04:51:30.707005034 +0000 UTC m=+49.153252497" lastFinishedPulling="2025-06-13 04:51:55.479369473 +0000 UTC m=+73.925616936" observedRunningTime="2025-06-13 04:51:56.010575505 +0000 UTC m=+74.456822968" watchObservedRunningTime="2025-06-13 04:51:56.011128381 +0000 UTC m=+74.457375864" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.053864 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqcmw" podStartSLOduration=3.133512854 podStartE2EDuration="31.053845306s" podCreationTimestamp="2025-06-13 04:51:25 +0000 UTC" firstStartedPulling="2025-06-13 04:51:27.506931392 +0000 UTC m=+45.953178895" lastFinishedPulling="2025-06-13 04:51:55.427263884 +0000 UTC m=+73.873511347" observedRunningTime="2025-06-13 04:51:56.051642642 +0000 UTC m=+74.497890105" watchObservedRunningTime="2025-06-13 04:51:56.053845306 +0000 UTC m=+74.500092769" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.102679 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.102738 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.282541 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" path="/var/lib/kubelet/pods/a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99/volumes" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.329698 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.329747 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.984207 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerStarted","Data":"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4"} Jun 13 04:51:56 crc kubenswrapper[4894]: I0613 04:51:56.986556 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerStarted","Data":"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff"} Jun 13 04:51:57 crc kubenswrapper[4894]: I0613 04:51:57.026533 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqxnn" podStartSLOduration=3.193690679 podStartE2EDuration="31.026517672s" podCreationTimestamp="2025-06-13 04:51:26 +0000 UTC" firstStartedPulling="2025-06-13 04:51:28.596332308 +0000 UTC m=+47.042579771" lastFinishedPulling="2025-06-13 04:51:56.429159301 +0000 UTC m=+74.875406764" observedRunningTime="2025-06-13 04:51:57.00657963 +0000 UTC m=+75.452827093" watchObservedRunningTime="2025-06-13 04:51:57.026517672 +0000 UTC m=+75.472765135" Jun 13 04:51:57 crc kubenswrapper[4894]: I0613 04:51:57.028842 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8bkp" podStartSLOduration=4.297520141 podStartE2EDuration="29.028834679s" 
podCreationTimestamp="2025-06-13 04:51:28 +0000 UTC" firstStartedPulling="2025-06-13 04:51:31.740346203 +0000 UTC m=+50.186593666" lastFinishedPulling="2025-06-13 04:51:56.471660741 +0000 UTC m=+74.917908204" observedRunningTime="2025-06-13 04:51:57.025053909 +0000 UTC m=+75.471301372" watchObservedRunningTime="2025-06-13 04:51:57.028834679 +0000 UTC m=+75.475082142" Jun 13 04:51:57 crc kubenswrapper[4894]: I0613 04:51:57.249521 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rd5qz" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="registry-server" probeResult="failure" output=< Jun 13 04:51:57 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 04:51:57 crc kubenswrapper[4894]: > Jun 13 04:51:57 crc kubenswrapper[4894]: I0613 04:51:57.365220 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vqcmw" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="registry-server" probeResult="failure" output=< Jun 13 04:51:57 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 04:51:57 crc kubenswrapper[4894]: > Jun 13 04:51:57 crc kubenswrapper[4894]: I0613 04:51:57.386009 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm76g" Jun 13 04:51:58 crc kubenswrapper[4894]: I0613 04:51:58.309945 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jun 13 04:51:59 crc kubenswrapper[4894]: I0613 04:51:59.159868 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:59 crc kubenswrapper[4894]: I0613 04:51:59.161911 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:51:59 crc kubenswrapper[4894]: I0613 04:51:59.517487 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:51:59 crc kubenswrapper[4894]: I0613 04:51:59.518238 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:52:00 crc kubenswrapper[4894]: I0613 04:52:00.204051 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r8bkp" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="registry-server" probeResult="failure" output=< Jun 13 04:52:00 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 04:52:00 crc kubenswrapper[4894]: > Jun 13 04:52:00 crc kubenswrapper[4894]: I0613 04:52:00.557809 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h8tmc" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="registry-server" probeResult="failure" output=< Jun 13 04:52:00 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 04:52:00 crc kubenswrapper[4894]: > Jun 13 04:52:05 crc kubenswrapper[4894]: I0613 04:52:05.301168 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.184912 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.214063 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.214030977 podStartE2EDuration="1.214030977s" podCreationTimestamp="2025-06-13 04:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:52:06.207581389 +0000 UTC m=+84.653828892" watchObservedRunningTime="2025-06-13 04:52:06.214030977 +0000 UTC m=+84.660278470" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.255367 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.379292 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.440695 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.575873 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.575968 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:06 crc kubenswrapper[4894]: I0613 04:52:06.641075 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:07 crc kubenswrapper[4894]: I0613 04:52:07.119691 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.058869 4894 generic.go:334] "Generic (PLEG): container finished" podID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerID="9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604" exitCode=0 Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.059276 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerDied","Data":"9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604"} Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.063836 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerDied","Data":"79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671"} Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.063739 4894 generic.go:334] "Generic (PLEG): container finished" podID="78966571-5be7-4b00-a993-39b3158ee935" containerID="79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671" exitCode=0 Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.069639 4894 generic.go:334] "Generic (PLEG): container finished" podID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerID="66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846" exitCode=0 Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.069733 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" 
event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerDied","Data":"66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846"} Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.250488 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.250930 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqcmw" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="registry-server" containerID="cri-o://3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f" gracePeriod=2 Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.744854 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.871667 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f42hl\" (UniqueName: \"kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl\") pod \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.871740 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content\") pod \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.871777 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities\") pod \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\" (UID: \"9fd0e39e-cf29-4fa4-9778-84bb3d255d82\") " Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.872820 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities" (OuterVolumeSpecName: "utilities") pod "9fd0e39e-cf29-4fa4-9778-84bb3d255d82" (UID: "9fd0e39e-cf29-4fa4-9778-84bb3d255d82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.878187 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl" (OuterVolumeSpecName: "kube-api-access-f42hl") pod "9fd0e39e-cf29-4fa4-9778-84bb3d255d82" (UID: "9fd0e39e-cf29-4fa4-9778-84bb3d255d82"). InnerVolumeSpecName "kube-api-access-f42hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.910432 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fd0e39e-cf29-4fa4-9778-84bb3d255d82" (UID: "9fd0e39e-cf29-4fa4-9778-84bb3d255d82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.972963 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f42hl\" (UniqueName: \"kubernetes.io/projected/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-kube-api-access-f42hl\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.972996 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:08 crc kubenswrapper[4894]: I0613 04:52:08.973006 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd0e39e-cf29-4fa4-9778-84bb3d255d82-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.077348 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqcmw" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.077371 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerDied","Data":"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f"} Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.077464 4894 scope.go:117] "RemoveContainer" containerID="3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.078786 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerID="3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f" exitCode=0 Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.079180 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqcmw" event={"ID":"9fd0e39e-cf29-4fa4-9778-84bb3d255d82","Type":"ContainerDied","Data":"2e1a6462f6b663cb0711276d627d0e2d329cfb7a84c4cf3842f58d375ba8e850"} Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.081603 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerStarted","Data":"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722"} Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.087890 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerStarted","Data":"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab"} Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.091530 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerStarted","Data":"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10"} Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.092740 4894 scope.go:117] "RemoveContainer" containerID="ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.114266 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xd5m5" podStartSLOduration=2.217905461 podStartE2EDuration="41.114249893s" 
podCreationTimestamp="2025-06-13 04:51:28 +0000 UTC" firstStartedPulling="2025-06-13 04:51:29.625327261 +0000 UTC m=+48.071574724" lastFinishedPulling="2025-06-13 04:52:08.521671683 +0000 UTC m=+86.967919156" observedRunningTime="2025-06-13 04:52:09.112228234 +0000 UTC m=+87.558475687" watchObservedRunningTime="2025-06-13 04:52:09.114249893 +0000 UTC m=+87.560497356" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.122894 4894 scope.go:117] "RemoveContainer" containerID="61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.132606 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r6hfb" podStartSLOduration=3.298214937 podStartE2EDuration="42.132586667s" podCreationTimestamp="2025-06-13 04:51:27 +0000 UTC" firstStartedPulling="2025-06-13 04:51:29.673767198 +0000 UTC m=+48.120014661" lastFinishedPulling="2025-06-13 04:52:08.508138898 +0000 UTC m=+86.954386391" observedRunningTime="2025-06-13 04:52:09.130910589 +0000 UTC m=+87.577158052" watchObservedRunningTime="2025-06-13 04:52:09.132586667 +0000 UTC m=+87.578834140" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.145165 4894 scope.go:117] "RemoveContainer" containerID="3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f" Jun 13 04:52:09 crc kubenswrapper[4894]: E0613 04:52:09.145645 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f\": container with ID starting with 3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f not found: ID does not exist" containerID="3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.145770 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f"} err="failed to get container status \"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f\": rpc error: code = NotFound desc = could not find container \"3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f\": container with ID starting with 3579d620f9b3a390c2db2748bdeba181c7bab4402fdd9342160e86efb8e15a4f not found: ID does not exist" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.145872 4894 scope.go:117] "RemoveContainer" containerID="ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556" Jun 13 04:52:09 crc kubenswrapper[4894]: E0613 04:52:09.146406 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556\": container with ID starting with ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556 not found: ID does not exist" containerID="ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.146439 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556"} err="failed to get container status \"ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556\": rpc error: code = NotFound desc = could not find container \"ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556\": container with ID starting with 
ff24c2d939c913424707df3af0949bf1f4edbe88913866935f05e8160c934556 not found: ID does not exist" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.146475 4894 scope.go:117] "RemoveContainer" containerID="61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f" Jun 13 04:52:09 crc kubenswrapper[4894]: E0613 04:52:09.146765 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f\": container with ID starting with 61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f not found: ID does not exist" containerID="61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.146791 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f"} err="failed to get container status \"61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f\": rpc error: code = NotFound desc = could not find container \"61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f\": container with ID starting with 61709947a2d4275d4f66f5fe3dfa445866ea7db809319995b43d400c3c1cfa6f not found: ID does not exist" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.157267 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47h94" podStartSLOduration=2.993513931 podStartE2EDuration="44.157252087s" podCreationTimestamp="2025-06-13 04:51:25 +0000 UTC" firstStartedPulling="2025-06-13 04:51:27.505412738 +0000 UTC m=+45.951660201" lastFinishedPulling="2025-06-13 04:52:08.669150884 +0000 UTC m=+87.115398357" observedRunningTime="2025-06-13 04:52:09.156215707 +0000 UTC m=+87.602463170" watchObservedRunningTime="2025-06-13 04:52:09.157252087 +0000 UTC m=+87.603499560" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.166507 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.171314 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqcmw"] Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.203323 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.244541 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.555491 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:52:09 crc kubenswrapper[4894]: I0613 04:52:09.603769 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:52:10 crc kubenswrapper[4894]: I0613 04:52:10.290995 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" path="/var/lib/kubelet/pods/9fd0e39e-cf29-4fa4-9778-84bb3d255d82/volumes" Jun 13 04:52:10 crc kubenswrapper[4894]: I0613 04:52:10.649609 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:52:10 crc kubenswrapper[4894]: I0613 04:52:10.650003 4894 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqxnn" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="registry-server" containerID="cri-o://4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4" gracePeriod=2 Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.085418 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.109672 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities\") pod \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.109722 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r47px\" (UniqueName: \"kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px\") pod \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.109782 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content\") pod \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\" (UID: \"e6291b0c-010c-4cf4-8df4-9007de0e4f35\") " Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.115780 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities" (OuterVolumeSpecName: "utilities") pod "e6291b0c-010c-4cf4-8df4-9007de0e4f35" (UID: "e6291b0c-010c-4cf4-8df4-9007de0e4f35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.123531 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px" (OuterVolumeSpecName: "kube-api-access-r47px") pod "e6291b0c-010c-4cf4-8df4-9007de0e4f35" (UID: "e6291b0c-010c-4cf4-8df4-9007de0e4f35"). InnerVolumeSpecName "kube-api-access-r47px". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.124441 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqxnn" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.124481 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerDied","Data":"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4"} Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.124526 4894 scope.go:117] "RemoveContainer" containerID="4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.124422 4894 generic.go:334] "Generic (PLEG): container finished" podID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerID="4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4" exitCode=0 Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.125101 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqxnn" event={"ID":"e6291b0c-010c-4cf4-8df4-9007de0e4f35","Type":"ContainerDied","Data":"394e818ce3836ddf7b1432dfbcb02052176c94bc13e4bf44db42ef1a17ab3553"} Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.160277 4894 scope.go:117] "RemoveContainer" containerID="43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.180260 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6291b0c-010c-4cf4-8df4-9007de0e4f35" (UID: "e6291b0c-010c-4cf4-8df4-9007de0e4f35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.188818 4894 scope.go:117] "RemoveContainer" containerID="476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.210790 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.210827 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6291b0c-010c-4cf4-8df4-9007de0e4f35-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.210840 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r47px\" (UniqueName: \"kubernetes.io/projected/e6291b0c-010c-4cf4-8df4-9007de0e4f35-kube-api-access-r47px\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.221473 4894 scope.go:117] "RemoveContainer" containerID="4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4" Jun 13 04:52:11 crc kubenswrapper[4894]: E0613 04:52:11.222627 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4\": container with ID starting with 4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4 not found: ID does not exist" containerID="4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.222740 4894 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4"} err="failed to get container status \"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4\": rpc error: code = NotFound desc = could not find container \"4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4\": container with ID starting with 4570acb62f25e6db6a53b784a2ed895b91ee887e53324524e1af3705405f0ef4 not found: ID does not exist" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.222779 4894 scope.go:117] "RemoveContainer" containerID="43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de" Jun 13 04:52:11 crc kubenswrapper[4894]: E0613 04:52:11.225272 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de\": container with ID starting with 43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de not found: ID does not exist" containerID="43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.225345 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de"} err="failed to get container status \"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de\": rpc error: code = NotFound desc = could not find container \"43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de\": container with ID starting with 43a742a0ebc519dbf868c5171989f31e879cc2e8707d7ab633fb4b91fad8b8de not found: ID does not exist" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.225417 4894 scope.go:117] "RemoveContainer" containerID="476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79" Jun 13 04:52:11 crc kubenswrapper[4894]: E0613 04:52:11.226159 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79\": container with ID starting with 476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79 not found: ID does not exist" containerID="476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.226208 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79"} err="failed to get container status \"476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79\": rpc error: code = NotFound desc = could not find container \"476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79\": container with ID starting with 476ffde45e306a7307df26757cf823dacefccca735699aee85d619364b1cbf79 not found: ID does not exist" Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.456593 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:52:11 crc kubenswrapper[4894]: I0613 04:52:11.461391 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqxnn"] Jun 13 04:52:12 crc kubenswrapper[4894]: I0613 04:52:12.285068 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" path="/var/lib/kubelet/pods/e6291b0c-010c-4cf4-8df4-9007de0e4f35/volumes" Jun 13 04:52:13 crc 
kubenswrapper[4894]: I0613 04:52:13.055598 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.056033 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h8tmc" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="registry-server" containerID="cri-o://a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6" gracePeriod=2 Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.462112 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.642039 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities\") pod \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.642219 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content\") pod \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.642349 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2cpc\" (UniqueName: \"kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc\") pod \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\" (UID: \"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc\") " Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.643574 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities" (OuterVolumeSpecName: "utilities") pod "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" (UID: "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.644635 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.653130 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc" (OuterVolumeSpecName: "kube-api-access-l2cpc") pod "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" (UID: "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc"). InnerVolumeSpecName "kube-api-access-l2cpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.735077 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" (UID: "2b8ec7b5-d500-4412-b84f-c5602dc2c6cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.746801 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:13 crc kubenswrapper[4894]: I0613 04:52:13.746872 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2cpc\" (UniqueName: \"kubernetes.io/projected/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc-kube-api-access-l2cpc\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.153336 4894 generic.go:334] "Generic (PLEG): container finished" podID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerID="a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6" exitCode=0 Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.153935 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerDied","Data":"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6"} Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.153986 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8tmc" event={"ID":"2b8ec7b5-d500-4412-b84f-c5602dc2c6cc","Type":"ContainerDied","Data":"808fb35e58aa0f06019e9c1feafe55d81ceb5ef8d39cbdf5d68af2e07029ec95"} Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.154022 4894 scope.go:117] "RemoveContainer" containerID="a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.154221 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h8tmc" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.180620 4894 scope.go:117] "RemoveContainer" containerID="853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.207599 4894 scope.go:117] "RemoveContainer" containerID="a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.215806 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.218594 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h8tmc"] Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.246585 4894 scope.go:117] "RemoveContainer" containerID="a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6" Jun 13 04:52:14 crc kubenswrapper[4894]: E0613 04:52:14.247080 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6\": container with ID starting with a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6 not found: ID does not exist" containerID="a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.247114 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6"} err="failed to get container status \"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6\": rpc error: code = NotFound desc = could not find container \"a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6\": container with ID starting with a468fe94c4c42e6aa963162431b75bd82cf7354be8b183e24fab5b99ced74ad6 not found: ID does not exist" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.247140 4894 scope.go:117] "RemoveContainer" containerID="853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d" Jun 13 04:52:14 crc kubenswrapper[4894]: E0613 04:52:14.248553 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d\": container with ID starting with 853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d not found: ID does not exist" containerID="853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.248575 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d"} err="failed to get container status \"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d\": rpc error: code = NotFound desc = could not find container \"853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d\": container with ID starting with 853d711566c5b8d201d9a9d8611ec2d805abf9e83440256e856299cbfde8fd4d not found: ID does not exist" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.248588 4894 scope.go:117] "RemoveContainer" containerID="a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0" Jun 13 04:52:14 crc kubenswrapper[4894]: E0613 04:52:14.249944 4894 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0\": container with ID starting with a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0 not found: ID does not exist" containerID="a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.250018 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0"} err="failed to get container status \"a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0\": rpc error: code = NotFound desc = could not find container \"a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0\": container with ID starting with a1d01c10bd44edd19f77cece1c3121aee1ea192a3837889fbfbb862014eb01e0 not found: ID does not exist" Jun 13 04:52:14 crc kubenswrapper[4894]: I0613 04:52:14.288520 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" path="/var/lib/kubelet/pods/2b8ec7b5-d500-4412-b84f-c5602dc2c6cc/volumes" Jun 13 04:52:15 crc kubenswrapper[4894]: I0613 04:52:15.902996 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:52:15 crc kubenswrapper[4894]: I0613 04:52:15.904284 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:52:15 crc kubenswrapper[4894]: I0613 04:52:15.955338 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:52:16 crc kubenswrapper[4894]: I0613 04:52:16.252071 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.104505 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.105078 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.169896 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.236596 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.797022 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.797244 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:18 crc kubenswrapper[4894]: I0613 04:52:18.836813 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:19 crc kubenswrapper[4894]: I0613 04:52:19.231105 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:20 crc kubenswrapper[4894]: I0613 04:52:20.842568 4894 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.202519 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xd5m5" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="registry-server" containerID="cri-o://f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab" gracePeriod=2 Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.511953 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.666510 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities\") pod \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.666575 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5hx\" (UniqueName: \"kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx\") pod \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.666692 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content\") pod \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\" (UID: \"d20f42b3-f716-4c1e-8720-bf1f6e43fc97\") " Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.667726 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities" (OuterVolumeSpecName: "utilities") pod "d20f42b3-f716-4c1e-8720-bf1f6e43fc97" (UID: "d20f42b3-f716-4c1e-8720-bf1f6e43fc97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.675675 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d20f42b3-f716-4c1e-8720-bf1f6e43fc97" (UID: "d20f42b3-f716-4c1e-8720-bf1f6e43fc97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.679391 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx" (OuterVolumeSpecName: "kube-api-access-mx5hx") pod "d20f42b3-f716-4c1e-8720-bf1f6e43fc97" (UID: "d20f42b3-f716-4c1e-8720-bf1f6e43fc97"). InnerVolumeSpecName "kube-api-access-mx5hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.768046 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.768081 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:21 crc kubenswrapper[4894]: I0613 04:52:21.768091 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5hx\" (UniqueName: \"kubernetes.io/projected/d20f42b3-f716-4c1e-8720-bf1f6e43fc97-kube-api-access-mx5hx\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.209207 4894 generic.go:334] "Generic (PLEG): container finished" podID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerID="f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab" exitCode=0 Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.209253 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerDied","Data":"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab"} Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.209311 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd5m5" event={"ID":"d20f42b3-f716-4c1e-8720-bf1f6e43fc97","Type":"ContainerDied","Data":"3a28ca645993d4fd867678ac809c92e28eba1ebd4a2933bdba4c686eeaff738a"} Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.209331 4894 scope.go:117] "RemoveContainer" containerID="f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.210772 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd5m5" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.232809 4894 scope.go:117] "RemoveContainer" containerID="66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.241846 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.246636 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd5m5"] Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.250945 4894 scope.go:117] "RemoveContainer" containerID="2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.272902 4894 scope.go:117] "RemoveContainer" containerID="f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab" Jun 13 04:52:22 crc kubenswrapper[4894]: E0613 04:52:22.275269 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab\": container with ID starting with f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab not found: ID does not exist" containerID="f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.275312 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab"} err="failed to get container status \"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab\": rpc error: code = NotFound desc = could not find container \"f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab\": container with ID starting with f20ea546dd4ee35f7f4cff3e44fabaa10caebd034b3f907fd889d950b77f5bab not found: ID does not exist" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.275340 4894 scope.go:117] "RemoveContainer" containerID="66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846" Jun 13 04:52:22 crc kubenswrapper[4894]: E0613 04:52:22.276079 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846\": container with ID starting with 66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846 not found: ID does not exist" containerID="66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.276111 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846"} err="failed to get container status \"66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846\": rpc error: code = NotFound desc = could not find container \"66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846\": container with ID starting with 66ef99dfd64033ecb50d7beab02589993b9b9f2a44d3a056fddbbd27d305f846 not found: ID does not exist" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.276136 4894 scope.go:117] "RemoveContainer" containerID="2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319" Jun 13 04:52:22 crc kubenswrapper[4894]: E0613 04:52:22.276482 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319\": container with ID starting with 2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319 not found: ID does not exist" containerID="2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.276527 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319"} err="failed to get container status \"2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319\": rpc error: code = NotFound desc = could not find container \"2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319\": container with ID starting with 2f1bea59464cd54dd579374405a4aade20c04d0995d41e1bcc8f87f35cc9a319 not found: ID does not exist" Jun 13 04:52:22 crc kubenswrapper[4894]: I0613 04:52:22.282986 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" path="/var/lib/kubelet/pods/d20f42b3-f716-4c1e-8720-bf1f6e43fc97/volumes" Jun 13 04:52:29 crc kubenswrapper[4894]: I0613 04:52:29.611566 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:52:54 crc kubenswrapper[4894]: I0613 04:52:54.642988 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" containerID="cri-o://d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1" gracePeriod=15 Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.157071 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.194854 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5c5747b78c-d2vxd"] Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195149 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195178 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195199 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebc74e0-474a-4a41-9066-8883d93711f3" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195211 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebc74e0-474a-4a41-9066-8883d93711f3" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195226 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195237 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195253 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195264 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195278 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195289 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195306 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efc2ace-7464-4a68-a22a-046404032bd1" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195317 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efc2ace-7464-4a68-a22a-046404032bd1" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195335 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195346 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195358 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195369 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195385 4894 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195395 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195406 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195417 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195433 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195444 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="extract-utilities" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195460 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195470 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195481 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195493 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195504 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195515 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195531 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195542 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.195555 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195565 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="extract-content" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195741 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55984d5-dfb4-4a7e-b9cb-ef7712f2eb99" containerName="kube-multus-additional-cni-plugins" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195764 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6291b0c-010c-4cf4-8df4-9007de0e4f35" containerName="registry-server" Jun 13 04:52:55 crc 
kubenswrapper[4894]: I0613 04:52:55.195780 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195792 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20f42b3-f716-4c1e-8720-bf1f6e43fc97" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195807 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efc2ace-7464-4a68-a22a-046404032bd1" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195822 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebc74e0-474a-4a41-9066-8883d93711f3" containerName="pruner" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195839 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8ec7b5-d500-4412-b84f-c5602dc2c6cc" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195855 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd0e39e-cf29-4fa4-9778-84bb3d255d82" containerName="registry-server" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195916 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.195977 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196007 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196043 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28bcf\" (UniqueName: \"kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196071 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196093 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196115 4894 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196144 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196170 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196228 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196253 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196276 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196302 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196328 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session\") pod \"45928378-9580-49fd-8831-f89923a0b98e\" (UID: \"45928378-9580-49fd-8831-f89923a0b98e\") " Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.196450 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.197472 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.197487 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.203827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.206827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.209251 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.213796 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.230329 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.230624 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.231537 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.232040 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.236110 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.248426 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.248982 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.254553 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c5747b78c-d2vxd"] Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.258484 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf" (OuterVolumeSpecName: "kube-api-access-28bcf") pod "45928378-9580-49fd-8831-f89923a0b98e" (UID: "45928378-9580-49fd-8831-f89923a0b98e"). InnerVolumeSpecName "kube-api-access-28bcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297173 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297365 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297475 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-login\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297544 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297623 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297744 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-session\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297797 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbb9n\" (UniqueName: \"kubernetes.io/projected/27e8fb2c-4016-420b-9716-a62423a441de-kube-api-access-xbb9n\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.297942 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298021 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-error\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298112 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298240 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27e8fb2c-4016-420b-9716-a62423a441de-audit-dir\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298307 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-audit-policies\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298390 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298546 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298708 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298732 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-session\") on node \"crc\" 
DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298748 4894 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-audit-policies\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298761 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298775 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298789 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28bcf\" (UniqueName: \"kubernetes.io/projected/45928378-9580-49fd-8831-f89923a0b98e-kube-api-access-28bcf\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298803 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298816 4894 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45928378-9580-49fd-8831-f89923a0b98e-audit-dir\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298829 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298842 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298855 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298867 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298879 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.298891 4894 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45928378-9580-49fd-8831-f89923a0b98e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400165 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400254 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-error\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400310 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400378 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27e8fb2c-4016-420b-9716-a62423a441de-audit-dir\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400464 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-audit-policies\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400518 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400562 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400604 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc 
kubenswrapper[4894]: I0613 04:52:55.400641 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400760 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400796 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400855 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-login\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400888 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-session\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.400938 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbb9n\" (UniqueName: \"kubernetes.io/projected/27e8fb2c-4016-420b-9716-a62423a441de-kube-api-access-xbb9n\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.402298 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.402319 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27e8fb2c-4016-420b-9716-a62423a441de-audit-dir\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.402980 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.403030 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.403645 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.403742 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.405095 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27e8fb2c-4016-420b-9716-a62423a441de-audit-policies\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.405758 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.406045 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-error\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.406321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.406778 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.409134 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-system-session\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.409852 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27e8fb2c-4016-420b-9716-a62423a441de-v4-0-config-user-template-login\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.420615 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbb9n\" (UniqueName: \"kubernetes.io/projected/27e8fb2c-4016-420b-9716-a62423a441de-kube-api-access-xbb9n\") pod \"oauth-openshift-5c5747b78c-d2vxd\" (UID: \"27e8fb2c-4016-420b-9716-a62423a441de\") " pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.459339 4894 generic.go:334] "Generic (PLEG): container finished" podID="45928378-9580-49fd-8831-f89923a0b98e" containerID="d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1" exitCode=0 Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.459397 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" event={"ID":"45928378-9580-49fd-8831-f89923a0b98e","Type":"ContainerDied","Data":"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1"} Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.459437 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" event={"ID":"45928378-9580-49fd-8831-f89923a0b98e","Type":"ContainerDied","Data":"f579f19a9a3c9cfeb2b79c8689d9e859ae95f113ec98aa7a3752a7ab21169d89"} Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.459463 4894 scope.go:117] "RemoveContainer" containerID="d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.459618 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.481259 4894 scope.go:117] "RemoveContainer" containerID="d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1" Jun 13 04:52:55 crc kubenswrapper[4894]: E0613 04:52:55.482324 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1\": container with ID starting with d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1 not found: ID does not exist" containerID="d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.482360 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1"} err="failed to get container status \"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1\": rpc error: code = NotFound desc = could not find container \"d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1\": container with ID starting with d6c57e78877057d0c1a6f2749ecc94513df1edeb604c5c881dbfe4d1297198a1 not found: ID does not exist" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.504542 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.509408 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwjkb"] Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.573841 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:55 crc kubenswrapper[4894]: I0613 04:52:55.809044 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c5747b78c-d2vxd"] Jun 13 04:52:55 crc kubenswrapper[4894]: W0613 04:52:55.819441 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e8fb2c_4016_420b_9716_a62423a441de.slice/crio-18c01654d8640d6b27c3bea7ba9358e1f19794be46d3aab1dbb87d8e501432dd WatchSource:0}: Error finding container 18c01654d8640d6b27c3bea7ba9358e1f19794be46d3aab1dbb87d8e501432dd: Status 404 returned error can't find the container with id 18c01654d8640d6b27c3bea7ba9358e1f19794be46d3aab1dbb87d8e501432dd Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.045329 4894 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rwjkb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.045645 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rwjkb" podUID="45928378-9580-49fd-8831-f89923a0b98e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.296944 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45928378-9580-49fd-8831-f89923a0b98e" path="/var/lib/kubelet/pods/45928378-9580-49fd-8831-f89923a0b98e/volumes" Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.473359 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" event={"ID":"27e8fb2c-4016-420b-9716-a62423a441de","Type":"ContainerStarted","Data":"23369fa566ccf760ff1941e06314bab355f413522c84fcff3b6c14a937c3be4f"} Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.473732 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" event={"ID":"27e8fb2c-4016-420b-9716-a62423a441de","Type":"ContainerStarted","Data":"18c01654d8640d6b27c3bea7ba9358e1f19794be46d3aab1dbb87d8e501432dd"} Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.473895 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.512066 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" podStartSLOduration=27.512045611 podStartE2EDuration="27.512045611s" podCreationTimestamp="2025-06-13 04:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:52:56.509266008 +0000 UTC m=+134.955513551" watchObservedRunningTime="2025-06-13 04:52:56.512045611 +0000 UTC m=+134.958293104" Jun 13 04:52:56 crc kubenswrapper[4894]: I0613 04:52:56.622909 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5c5747b78c-d2vxd" 
Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.798189 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.799605 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-47h94" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="registry-server" containerID="cri-o://bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10" gracePeriod=30 Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.809271 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.819482 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.819726 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" containerID="cri-o://8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300" gracePeriod=30 Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.820368 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rd5qz" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="registry-server" containerID="cri-o://cc2282362d218f6ad76c98b18adb461fccf1c34545e3a2ecdc1e27c4c4bc06e1" gracePeriod=30 Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.826253 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.826844 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r6hfb" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="registry-server" containerID="cri-o://a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722" gracePeriod=30 Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.861492 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cgms"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.862326 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.876996 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.877258 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8bkp" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="registry-server" containerID="cri-o://03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff" gracePeriod=30 Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.899722 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cgms"] Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.960749 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zpf\" (UniqueName: \"kubernetes.io/projected/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-kube-api-access-h6zpf\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.960798 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:06 crc kubenswrapper[4894]: I0613 04:53:06.960852 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.062142 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zpf\" (UniqueName: \"kubernetes.io/projected/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-kube-api-access-h6zpf\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.062208 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.062295 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.066730 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.079819 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.082841 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zpf\" (UniqueName: \"kubernetes.io/projected/b31c24cf-e9ca-4ae7-aba9-d151e098ae5c-kube-api-access-h6zpf\") pod \"marketplace-operator-79b997595-7cgms\" (UID: \"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.184367 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.216827 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.264513 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities\") pod \"0cbd7e4c-94e5-49ed-9c93-674f70506056\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.264558 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content\") pod \"0cbd7e4c-94e5-49ed-9c93-674f70506056\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.264591 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q87r\" (UniqueName: \"kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r\") pod \"0cbd7e4c-94e5-49ed-9c93-674f70506056\" (UID: \"0cbd7e4c-94e5-49ed-9c93-674f70506056\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.271157 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r" (OuterVolumeSpecName: "kube-api-access-8q87r") pod "0cbd7e4c-94e5-49ed-9c93-674f70506056" (UID: "0cbd7e4c-94e5-49ed-9c93-674f70506056"). InnerVolumeSpecName "kube-api-access-8q87r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.275904 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cbd7e4c-94e5-49ed-9c93-674f70506056" (UID: "0cbd7e4c-94e5-49ed-9c93-674f70506056"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.282773 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities" (OuterVolumeSpecName: "utilities") pod "0cbd7e4c-94e5-49ed-9c93-674f70506056" (UID: "0cbd7e4c-94e5-49ed-9c93-674f70506056"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.301623 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.307430 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.341256 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370137 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca\") pod \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370176 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities\") pod \"78966571-5be7-4b00-a993-39b3158ee935\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370209 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27trx\" (UniqueName: \"kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx\") pod \"78966571-5be7-4b00-a993-39b3158ee935\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370300 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content\") pod \"78966571-5be7-4b00-a993-39b3158ee935\" (UID: \"78966571-5be7-4b00-a993-39b3158ee935\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370321 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbzk\" (UniqueName: \"kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk\") pod \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370356 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics\") pod \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\" (UID: \"cf11c387-9f91-4c1c-aaea-69f41a35d30c\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370375 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content\") pod 
\"318542e4-4246-4790-b652-4a173323ec4f\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370448 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d65dc\" (UniqueName: \"kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc\") pod \"318542e4-4246-4790-b652-4a173323ec4f\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370483 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities\") pod \"318542e4-4246-4790-b652-4a173323ec4f\" (UID: \"318542e4-4246-4790-b652-4a173323ec4f\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370761 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370776 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cbd7e4c-94e5-49ed-9c93-674f70506056-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.370788 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q87r\" (UniqueName: \"kubernetes.io/projected/0cbd7e4c-94e5-49ed-9c93-674f70506056-kube-api-access-8q87r\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.373646 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cf11c387-9f91-4c1c-aaea-69f41a35d30c" (UID: "cf11c387-9f91-4c1c-aaea-69f41a35d30c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.374454 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities" (OuterVolumeSpecName: "utilities") pod "78966571-5be7-4b00-a993-39b3158ee935" (UID: "78966571-5be7-4b00-a993-39b3158ee935"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.376962 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities" (OuterVolumeSpecName: "utilities") pod "318542e4-4246-4790-b652-4a173323ec4f" (UID: "318542e4-4246-4790-b652-4a173323ec4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.378990 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx" (OuterVolumeSpecName: "kube-api-access-27trx") pod "78966571-5be7-4b00-a993-39b3158ee935" (UID: "78966571-5be7-4b00-a993-39b3158ee935"). InnerVolumeSpecName "kube-api-access-27trx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.382567 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cf11c387-9f91-4c1c-aaea-69f41a35d30c" (UID: "cf11c387-9f91-4c1c-aaea-69f41a35d30c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.395851 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc" (OuterVolumeSpecName: "kube-api-access-d65dc") pod "318542e4-4246-4790-b652-4a173323ec4f" (UID: "318542e4-4246-4790-b652-4a173323ec4f"). InnerVolumeSpecName "kube-api-access-d65dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.400757 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk" (OuterVolumeSpecName: "kube-api-access-lwbzk") pod "cf11c387-9f91-4c1c-aaea-69f41a35d30c" (UID: "cf11c387-9f91-4c1c-aaea-69f41a35d30c"). InnerVolumeSpecName "kube-api-access-lwbzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.401987 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78966571-5be7-4b00-a993-39b3158ee935" (UID: "78966571-5be7-4b00-a993-39b3158ee935"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.457678 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cgms"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471528 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d65dc\" (UniqueName: \"kubernetes.io/projected/318542e4-4246-4790-b652-4a173323ec4f-kube-api-access-d65dc\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471551 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471561 4894 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471571 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471581 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27trx\" (UniqueName: \"kubernetes.io/projected/78966571-5be7-4b00-a993-39b3158ee935-kube-api-access-27trx\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471589 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78966571-5be7-4b00-a993-39b3158ee935-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471599 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwbzk\" (UniqueName: \"kubernetes.io/projected/cf11c387-9f91-4c1c-aaea-69f41a35d30c-kube-api-access-lwbzk\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.471608 4894 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf11c387-9f91-4c1c-aaea-69f41a35d30c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.477774 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "318542e4-4246-4790-b652-4a173323ec4f" (UID: "318542e4-4246-4790-b652-4a173323ec4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.548672 4894 generic.go:334] "Generic (PLEG): container finished" podID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerID="cc2282362d218f6ad76c98b18adb461fccf1c34545e3a2ecdc1e27c4c4bc06e1" exitCode=0 Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.548691 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerDied","Data":"cc2282362d218f6ad76c98b18adb461fccf1c34545e3a2ecdc1e27c4c4bc06e1"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.550472 4894 generic.go:334] "Generic (PLEG): container finished" podID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerID="8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300" exitCode=0 Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.550526 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" event={"ID":"cf11c387-9f91-4c1c-aaea-69f41a35d30c","Type":"ContainerDied","Data":"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.550543 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" event={"ID":"cf11c387-9f91-4c1c-aaea-69f41a35d30c","Type":"ContainerDied","Data":"abfe5dbfb1e156199ba0f6c8eabadb3f94bbb9b942a746ce14ab2c646d5b45e1"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.550559 4894 scope.go:117] "RemoveContainer" containerID="8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.550578 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.567636 4894 generic.go:334] "Generic (PLEG): container finished" podID="318542e4-4246-4790-b652-4a173323ec4f" containerID="03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff" exitCode=0 Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.567699 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerDied","Data":"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.567720 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bkp" event={"ID":"318542e4-4246-4790-b652-4a173323ec4f","Type":"ContainerDied","Data":"cf1f540c955b17d1dcd291376c2ef56eb7c4934b809d117ab14250108b3b0f4b"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.567835 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bkp" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.572085 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318542e4-4246-4790-b652-4a173323ec4f-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.580213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" event={"ID":"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c","Type":"ContainerStarted","Data":"d317b941fe0d03bcac41bd968171dcf5143f0310640fdac488328d2f5857a85c"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.596488 4894 scope.go:117] "RemoveContainer" containerID="8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.597088 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300\": container with ID starting with 8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300 not found: ID does not exist" containerID="8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.597142 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300"} err="failed to get container status \"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300\": rpc error: code = NotFound desc = could not find container \"8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300\": container with ID starting with 8ff2eeb4680b410a5be32c39304647e5afc9035e3310f7fada27b77936663300 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.597167 4894 scope.go:117] "RemoveContainer" containerID="03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.599768 4894 generic.go:334] "Generic (PLEG): container finished" podID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerID="a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722" exitCode=0 Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.600291 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6hfb" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.600304 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerDied","Data":"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.607388 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6hfb" event={"ID":"0cbd7e4c-94e5-49ed-9c93-674f70506056","Type":"ContainerDied","Data":"643ef98bd0d7da569f989a4b2c4f3177eae9b99dc291533ea36b6861d1da385f"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.610086 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.622421 4894 generic.go:334] "Generic (PLEG): container finished" podID="78966571-5be7-4b00-a993-39b3158ee935" containerID="bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10" exitCode=0 Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.622463 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerDied","Data":"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.622490 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47h94" event={"ID":"78966571-5be7-4b00-a993-39b3158ee935","Type":"ContainerDied","Data":"004e2cf2b94724be5023d6e4932361242e660074510338af261e2b960f274182"} Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.622551 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47h94" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.626994 4894 scope.go:117] "RemoveContainer" containerID="ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.635056 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tr7lv"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.654263 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.664805 4894 scope.go:117] "RemoveContainer" containerID="37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.667461 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8bkp"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.668549 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.670525 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.672992 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6hfb"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.685030 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.686838 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-47h94"] Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.691876 4894 scope.go:117] "RemoveContainer" containerID="03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.712279 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff\": container with ID starting with 03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff not found: ID does not exist" containerID="03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.712325 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff"} err="failed to get container status \"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff\": rpc error: code = NotFound desc = could not find container \"03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff\": container with ID starting with 03d5e16decff53d3f89c9e0477acb6076ebded5a148cced9b283736b9b0cacff not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.712352 4894 scope.go:117] "RemoveContainer" containerID="ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.713415 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae\": container with ID starting with ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae not found: ID does not exist" containerID="ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.713444 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae"} err="failed to get container status \"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae\": rpc error: code = NotFound desc = could not find container \"ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae\": container with ID starting with ab79ca30c461337e9fd988802d1ad8162e78c83b0a1aeace62b78b6f2188c5ae not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.713459 4894 scope.go:117] "RemoveContainer" containerID="37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.715496 4894 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6\": container with ID starting with 37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6 not found: ID does not exist" containerID="37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.715520 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6"} err="failed to get container status \"37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6\": rpc error: code = NotFound desc = could not find container \"37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6\": container with ID starting with 37e430123c77e982c979c6f30ecf01663e23b62444f8b9cda0e492aaef1c2bb6 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.715535 4894 scope.go:117] "RemoveContainer" containerID="a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.729804 4894 scope.go:117] "RemoveContainer" containerID="9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.743935 4894 scope.go:117] "RemoveContainer" containerID="ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.757260 4894 scope.go:117] "RemoveContainer" containerID="a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.757623 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722\": container with ID starting with a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722 not found: ID does not exist" containerID="a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.757678 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722"} err="failed to get container status \"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722\": rpc error: code = NotFound desc = could not find container \"a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722\": container with ID starting with a770a97dcd0478f1f7554cc3d408218d515729541681e0ea4dd8f5b5742ba722 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.757711 4894 scope.go:117] "RemoveContainer" containerID="9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.758070 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604\": container with ID starting with 9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604 not found: ID does not exist" containerID="9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.758107 4894 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604"} err="failed to get container status \"9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604\": rpc error: code = NotFound desc = could not find container \"9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604\": container with ID starting with 9c0030c1691c01ea50e26d552693d35ba12d836167fb395d8af73ec61a1bc604 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.758131 4894 scope.go:117] "RemoveContainer" containerID="ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.758358 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb\": container with ID starting with ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb not found: ID does not exist" containerID="ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.758382 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb"} err="failed to get container status \"ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb\": rpc error: code = NotFound desc = could not find container \"ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb\": container with ID starting with ca39d8539e969358ea67ae7d13e1fcf4f233d1362bcbffa971765dda6edde4fb not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.758397 4894 scope.go:117] "RemoveContainer" containerID="bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.773445 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities\") pod \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.773536 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content\") pod \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.773611 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h\") pod \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\" (UID: \"a88e73a7-e3d0-4015-9f77-748ea17f6e39\") " Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.774061 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities" (OuterVolumeSpecName: "utilities") pod "a88e73a7-e3d0-4015-9f77-748ea17f6e39" (UID: "a88e73a7-e3d0-4015-9f77-748ea17f6e39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.777232 4894 scope.go:117] "RemoveContainer" containerID="79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.778733 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h" (OuterVolumeSpecName: "kube-api-access-6l52h") pod "a88e73a7-e3d0-4015-9f77-748ea17f6e39" (UID: "a88e73a7-e3d0-4015-9f77-748ea17f6e39"). InnerVolumeSpecName "kube-api-access-6l52h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.793962 4894 scope.go:117] "RemoveContainer" containerID="6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.809413 4894 scope.go:117] "RemoveContainer" containerID="bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.810169 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10\": container with ID starting with bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10 not found: ID does not exist" containerID="bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.810270 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10"} err="failed to get container status \"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10\": rpc error: code = NotFound desc = could not find container \"bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10\": container with ID starting with bec18c498643861927ca88692183c37f090aaeb1f8538aae4d34dd9faf862e10 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.810354 4894 scope.go:117] "RemoveContainer" containerID="79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671" Jun 13 04:53:07 crc kubenswrapper[4894]: E0613 04:53:07.810779 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671\": container with ID starting with 79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671 not found: ID does not exist" containerID="79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.810821 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671"} err="failed to get container status \"79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671\": rpc error: code = NotFound desc = could not find container \"79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671\": container with ID starting with 79dc978c9ae58d2e5d5e338eea642b7da6d022532653a171c0918999435bf671 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.810852 4894 scope.go:117] "RemoveContainer" containerID="6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92" Jun 13 04:53:07 crc 
kubenswrapper[4894]: E0613 04:53:07.811475 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92\": container with ID starting with 6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92 not found: ID does not exist" containerID="6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.811512 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92"} err="failed to get container status \"6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92\": rpc error: code = NotFound desc = could not find container \"6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92\": container with ID starting with 6da2b93d88f7300adcf269c1da5247a83070c113279103f168103536100afe92 not found: ID does not exist" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.820290 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a88e73a7-e3d0-4015-9f77-748ea17f6e39" (UID: "a88e73a7-e3d0-4015-9f77-748ea17f6e39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.875556 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l52h\" (UniqueName: \"kubernetes.io/projected/a88e73a7-e3d0-4015-9f77-748ea17f6e39-kube-api-access-6l52h\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.875601 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:07 crc kubenswrapper[4894]: I0613 04:53:07.875613 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88e73a7-e3d0-4015-9f77-748ea17f6e39-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.172435 4894 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tr7lv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.172506 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tr7lv" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204148 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b5kgb"] Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204334 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204347 4894 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204353 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204361 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204369 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204375 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204387 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204392 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204402 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204407 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204414 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204421 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204428 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204434 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204441 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204447 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204455 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204461 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204470 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 
04:53:08.204476 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204483 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204490 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204497 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204504 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="extract-content" Jun 13 04:53:08 crc kubenswrapper[4894]: E0613 04:53:08.204529 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204535 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="extract-utilities" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204629 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="78966571-5be7-4b00-a993-39b3158ee935" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204641 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" containerName="marketplace-operator" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204663 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="318542e4-4246-4790-b652-4a173323ec4f" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204671 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.204678 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" containerName="registry-server" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.205353 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.208137 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.217994 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5kgb"] Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.283246 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbd7e4c-94e5-49ed-9c93-674f70506056" path="/var/lib/kubelet/pods/0cbd7e4c-94e5-49ed-9c93-674f70506056/volumes" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.283861 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318542e4-4246-4790-b652-4a173323ec4f" path="/var/lib/kubelet/pods/318542e4-4246-4790-b652-4a173323ec4f/volumes" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.284428 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78966571-5be7-4b00-a993-39b3158ee935" path="/var/lib/kubelet/pods/78966571-5be7-4b00-a993-39b3158ee935/volumes" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.285011 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf11c387-9f91-4c1c-aaea-69f41a35d30c" path="/var/lib/kubelet/pods/cf11c387-9f91-4c1c-aaea-69f41a35d30c/volumes" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.394754 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhwn\" (UniqueName: \"kubernetes.io/projected/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-kube-api-access-2jhwn\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.394845 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-catalog-content\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.394869 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-utilities\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.495961 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-catalog-content\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.496032 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-utilities\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.496112 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2jhwn\" (UniqueName: \"kubernetes.io/projected/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-kube-api-access-2jhwn\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.496943 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-utilities\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.497121 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-catalog-content\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.516565 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhwn\" (UniqueName: \"kubernetes.io/projected/45f0f708-2377-4d6f-915f-ae04b3ba4e4b-kube-api-access-2jhwn\") pod \"redhat-marketplace-b5kgb\" (UID: \"45f0f708-2377-4d6f-915f-ae04b3ba4e4b\") " pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.528602 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.633095 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd5qz" event={"ID":"a88e73a7-e3d0-4015-9f77-748ea17f6e39","Type":"ContainerDied","Data":"93ded20b072bcfc0207a34af38b9c8e54a3e073dbbf84d9790e2baa22b612315"} Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.633454 4894 scope.go:117] "RemoveContainer" containerID="cc2282362d218f6ad76c98b18adb461fccf1c34545e3a2ecdc1e27c4c4bc06e1" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.633161 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rd5qz" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.642847 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" event={"ID":"b31c24cf-e9ca-4ae7-aba9-d151e098ae5c","Type":"ContainerStarted","Data":"33282919bf223edd8bff3454f2308c1bd13d23932d1c3b0f8afe4fe9b3dcd0c8"} Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.643056 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.649000 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.656743 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.674965 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rd5qz"] Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.685319 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7cgms" podStartSLOduration=2.685300581 podStartE2EDuration="2.685300581s" podCreationTimestamp="2025-06-13 04:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:53:08.685010063 +0000 UTC m=+147.131257516" watchObservedRunningTime="2025-06-13 04:53:08.685300581 +0000 UTC m=+147.131548044" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.687578 4894 scope.go:117] "RemoveContainer" containerID="d850497847391f8a8018f0089dcae8781d46543ea3e4d2b5f91526ce76a856cf" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.710885 4894 scope.go:117] "RemoveContainer" containerID="5bb4bef9d030e5c1c3cd6b528d931e48be19a628f67735e497666ef1eb6f0571" Jun 13 04:53:08 crc kubenswrapper[4894]: I0613 04:53:08.817579 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5kgb"] Jun 13 04:53:08 crc kubenswrapper[4894]: W0613 04:53:08.827219 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f0f708_2377_4d6f_915f_ae04b3ba4e4b.slice/crio-e6ad5458e66721260c2777b5fdb2558ae80dfdd9957fa53e427d41c744de1eb6 WatchSource:0}: Error finding container e6ad5458e66721260c2777b5fdb2558ae80dfdd9957fa53e427d41c744de1eb6: Status 404 returned error can't find the container with id e6ad5458e66721260c2777b5fdb2558ae80dfdd9957fa53e427d41c744de1eb6 Jun 13 04:53:09 crc kubenswrapper[4894]: I0613 04:53:09.651170 4894 generic.go:334] "Generic (PLEG): container finished" podID="45f0f708-2377-4d6f-915f-ae04b3ba4e4b" containerID="4fcb52715584ea601a4c6e6e85aa9cd21161e9f1ad6131e9979c0bf8199acb26" exitCode=0 Jun 13 04:53:09 crc kubenswrapper[4894]: I0613 04:53:09.651229 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5kgb" event={"ID":"45f0f708-2377-4d6f-915f-ae04b3ba4e4b","Type":"ContainerDied","Data":"4fcb52715584ea601a4c6e6e85aa9cd21161e9f1ad6131e9979c0bf8199acb26"} Jun 13 04:53:09 crc kubenswrapper[4894]: I0613 04:53:09.651526 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-b5kgb" event={"ID":"45f0f708-2377-4d6f-915f-ae04b3ba4e4b","Type":"ContainerStarted","Data":"e6ad5458e66721260c2777b5fdb2558ae80dfdd9957fa53e427d41c744de1eb6"} Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.009575 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgrkm"] Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.010451 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.012391 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.019987 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-catalog-content\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.020095 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-utilities\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.020200 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swrx\" (UniqueName: \"kubernetes.io/projected/aa646541-7ac7-4734-a72b-f1ee54746c8b-kube-api-access-7swrx\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.060900 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgrkm"] Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.121717 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-utilities\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.121781 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7swrx\" (UniqueName: \"kubernetes.io/projected/aa646541-7ac7-4734-a72b-f1ee54746c8b-kube-api-access-7swrx\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.121841 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-catalog-content\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.122373 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-utilities\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.122509 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa646541-7ac7-4734-a72b-f1ee54746c8b-catalog-content\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.146835 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swrx\" (UniqueName: \"kubernetes.io/projected/aa646541-7ac7-4734-a72b-f1ee54746c8b-kube-api-access-7swrx\") pod \"certified-operators-kgrkm\" (UID: \"aa646541-7ac7-4734-a72b-f1ee54746c8b\") " pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.282010 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a88e73a7-e3d0-4015-9f77-748ea17f6e39" path="/var/lib/kubelet/pods/a88e73a7-e3d0-4015-9f77-748ea17f6e39/volumes" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.331588 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.611787 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xltkn"] Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.615419 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.617762 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.626439 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xltkn"] Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.633280 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-utilities\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.633391 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-catalog-content\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.633481 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnm9\" (UniqueName: \"kubernetes.io/projected/28d29738-add5-4eae-882b-341e47914202-kube-api-access-2lnm9\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.658543 4894 generic.go:334] "Generic (PLEG): container finished" 
podID="45f0f708-2377-4d6f-915f-ae04b3ba4e4b" containerID="a40f389385711a46d5e0185aaabcdc7c8af299bf74e7dc41b510963304fa3867" exitCode=0 Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.658619 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5kgb" event={"ID":"45f0f708-2377-4d6f-915f-ae04b3ba4e4b","Type":"ContainerDied","Data":"a40f389385711a46d5e0185aaabcdc7c8af299bf74e7dc41b510963304fa3867"} Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.734920 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-utilities\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.734955 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-catalog-content\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.735000 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lnm9\" (UniqueName: \"kubernetes.io/projected/28d29738-add5-4eae-882b-341e47914202-kube-api-access-2lnm9\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.735635 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-catalog-content\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.736849 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d29738-add5-4eae-882b-341e47914202-utilities\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.742012 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgrkm"] Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.758692 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lnm9\" (UniqueName: \"kubernetes.io/projected/28d29738-add5-4eae-882b-341e47914202-kube-api-access-2lnm9\") pod \"redhat-operators-xltkn\" (UID: \"28d29738-add5-4eae-882b-341e47914202\") " pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:10 crc kubenswrapper[4894]: I0613 04:53:10.928250 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.144677 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xltkn"] Jun 13 04:53:11 crc kubenswrapper[4894]: W0613 04:53:11.155607 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d29738_add5_4eae_882b_341e47914202.slice/crio-dfe12b507e4c329c4b0b2a0c95432a9414f315691f2f44f3a195095958c77dfa WatchSource:0}: Error finding container dfe12b507e4c329c4b0b2a0c95432a9414f315691f2f44f3a195095958c77dfa: Status 404 returned error can't find the container with id dfe12b507e4c329c4b0b2a0c95432a9414f315691f2f44f3a195095958c77dfa Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.664533 4894 generic.go:334] "Generic (PLEG): container finished" podID="28d29738-add5-4eae-882b-341e47914202" containerID="711bc366a73f1c6751ea5892a00ccb657489ec64d398ab5e72ce373d531f26ac" exitCode=0 Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.664727 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xltkn" event={"ID":"28d29738-add5-4eae-882b-341e47914202","Type":"ContainerDied","Data":"711bc366a73f1c6751ea5892a00ccb657489ec64d398ab5e72ce373d531f26ac"} Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.665327 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xltkn" event={"ID":"28d29738-add5-4eae-882b-341e47914202","Type":"ContainerStarted","Data":"dfe12b507e4c329c4b0b2a0c95432a9414f315691f2f44f3a195095958c77dfa"} Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.666893 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5kgb" event={"ID":"45f0f708-2377-4d6f-915f-ae04b3ba4e4b","Type":"ContainerStarted","Data":"cc51b30dfd8b261f508d3c45826754c4fdb55792f89053850cced6fa80c7f349"} Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.670073 4894 generic.go:334] "Generic (PLEG): container finished" podID="aa646541-7ac7-4734-a72b-f1ee54746c8b" containerID="7a6f2828d2913267ac40fdc8caffa2ccd48142d7b37566e150b0173e0d21fc4c" exitCode=0 Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.670100 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgrkm" event={"ID":"aa646541-7ac7-4734-a72b-f1ee54746c8b","Type":"ContainerDied","Data":"7a6f2828d2913267ac40fdc8caffa2ccd48142d7b37566e150b0173e0d21fc4c"} Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.670116 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgrkm" event={"ID":"aa646541-7ac7-4734-a72b-f1ee54746c8b","Type":"ContainerStarted","Data":"3508ff3a69b407c15b48418d0bf756ed0ca904fc55785589de2f3b26140bcf3e"} Jun 13 04:53:11 crc kubenswrapper[4894]: I0613 04:53:11.699436 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b5kgb" podStartSLOduration=2.25434915 podStartE2EDuration="3.69942002s" podCreationTimestamp="2025-06-13 04:53:08 +0000 UTC" firstStartedPulling="2025-06-13 04:53:09.652938522 +0000 UTC m=+148.099185985" lastFinishedPulling="2025-06-13 04:53:11.098009392 +0000 UTC m=+149.544256855" observedRunningTime="2025-06-13 04:53:11.698556894 +0000 UTC m=+150.144804357" watchObservedRunningTime="2025-06-13 04:53:11.69942002 +0000 UTC m=+150.145667483" Jun 13 04:53:12 crc 
kubenswrapper[4894]: I0613 04:53:12.415699 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.417015 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.419020 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.426129 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.463269 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.463346 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvccg\" (UniqueName: \"kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.463368 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.564529 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.564618 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvccg\" (UniqueName: \"kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.564644 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.565067 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 
crc kubenswrapper[4894]: I0613 04:53:12.565165 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.589589 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvccg\" (UniqueName: \"kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg\") pod \"community-operators-zf66l\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.676401 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xltkn" event={"ID":"28d29738-add5-4eae-882b-341e47914202","Type":"ContainerStarted","Data":"edc28698835649e22045974caf677b1e18773d8c20eb5e556dca8a672f57db67"} Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.678095 4894 generic.go:334] "Generic (PLEG): container finished" podID="aa646541-7ac7-4734-a72b-f1ee54746c8b" containerID="18a88249d97223c709577cc7d115dc1229f11dd40bfbc9169dba2c34a66f23c7" exitCode=0 Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.678720 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgrkm" event={"ID":"aa646541-7ac7-4734-a72b-f1ee54746c8b","Type":"ContainerDied","Data":"18a88249d97223c709577cc7d115dc1229f11dd40bfbc9169dba2c34a66f23c7"} Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.771878 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:12 crc kubenswrapper[4894]: I0613 04:53:12.936915 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 04:53:12 crc kubenswrapper[4894]: W0613 04:53:12.943352 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70994bbf_8b6d_476f_8125_191d4a08205e.slice/crio-02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846 WatchSource:0}: Error finding container 02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846: Status 404 returned error can't find the container with id 02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846 Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.683753 4894 generic.go:334] "Generic (PLEG): container finished" podID="28d29738-add5-4eae-882b-341e47914202" containerID="edc28698835649e22045974caf677b1e18773d8c20eb5e556dca8a672f57db67" exitCode=0 Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.683824 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xltkn" event={"ID":"28d29738-add5-4eae-882b-341e47914202","Type":"ContainerDied","Data":"edc28698835649e22045974caf677b1e18773d8c20eb5e556dca8a672f57db67"} Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.687272 4894 generic.go:334] "Generic (PLEG): container finished" podID="70994bbf-8b6d-476f-8125-191d4a08205e" containerID="38ad01742c7fbefdc6d3538cc0ca51c9c25dbc688fee254ed6295ac170e0a4f9" exitCode=0 Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.687342 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerDied","Data":"38ad01742c7fbefdc6d3538cc0ca51c9c25dbc688fee254ed6295ac170e0a4f9"} Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.687384 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerStarted","Data":"02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846"} Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.690517 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgrkm" event={"ID":"aa646541-7ac7-4734-a72b-f1ee54746c8b","Type":"ContainerStarted","Data":"310f128ef728eb15af879c048ff3a9753bee04627882e319f4cd8da3c30d5875"} Jun 13 04:53:13 crc kubenswrapper[4894]: I0613 04:53:13.728612 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgrkm" podStartSLOduration=3.265714773 podStartE2EDuration="4.728593433s" podCreationTimestamp="2025-06-13 04:53:09 +0000 UTC" firstStartedPulling="2025-06-13 04:53:11.671199821 +0000 UTC m=+150.117447284" lastFinishedPulling="2025-06-13 04:53:13.134078481 +0000 UTC m=+151.580325944" observedRunningTime="2025-06-13 04:53:13.726989495 +0000 UTC m=+152.173236958" watchObservedRunningTime="2025-06-13 04:53:13.728593433 +0000 UTC m=+152.174840886" Jun 13 04:53:14 crc kubenswrapper[4894]: I0613 04:53:14.698508 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xltkn" event={"ID":"28d29738-add5-4eae-882b-341e47914202","Type":"ContainerStarted","Data":"2c9dfc0910f379e1d318ab66a869066208c0ad48df9fe34ccd6f7c338a123f5e"} Jun 13 04:53:14 crc kubenswrapper[4894]: I0613 04:53:14.700527 4894 generic.go:334] "Generic (PLEG): container finished" podID="70994bbf-8b6d-476f-8125-191d4a08205e" containerID="35086ac5d287e8d043392ae98b89e6ea03178239a86d2099fa3ad49e4a677c88" exitCode=0 Jun 13 04:53:14 crc kubenswrapper[4894]: I0613 04:53:14.700636 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerDied","Data":"35086ac5d287e8d043392ae98b89e6ea03178239a86d2099fa3ad49e4a677c88"} Jun 13 04:53:14 crc kubenswrapper[4894]: I0613 04:53:14.721268 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xltkn" podStartSLOduration=2.263968111 podStartE2EDuration="4.721250528s" podCreationTimestamp="2025-06-13 04:53:10 +0000 UTC" firstStartedPulling="2025-06-13 04:53:11.66750006 +0000 UTC m=+150.113747513" lastFinishedPulling="2025-06-13 04:53:14.124782467 +0000 UTC m=+152.571029930" observedRunningTime="2025-06-13 04:53:14.720058572 +0000 UTC m=+153.166306045" watchObservedRunningTime="2025-06-13 04:53:14.721250528 +0000 UTC m=+153.167497991" Jun 13 04:53:16 crc kubenswrapper[4894]: I0613 04:53:16.713610 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerStarted","Data":"43f4ed7693fb32c304ea9933edbe378d969271942642f73649450fd25df3408d"} Jun 13 04:53:16 crc kubenswrapper[4894]: I0613 04:53:16.733497 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zf66l" podStartSLOduration=3.2790992 
podStartE2EDuration="4.733471478s" podCreationTimestamp="2025-06-13 04:53:12 +0000 UTC" firstStartedPulling="2025-06-13 04:53:13.688935803 +0000 UTC m=+152.135183266" lastFinishedPulling="2025-06-13 04:53:15.143308081 +0000 UTC m=+153.589555544" observedRunningTime="2025-06-13 04:53:16.731268282 +0000 UTC m=+155.177515745" watchObservedRunningTime="2025-06-13 04:53:16.733471478 +0000 UTC m=+155.179718941" Jun 13 04:53:18 crc kubenswrapper[4894]: I0613 04:53:18.530574 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:18 crc kubenswrapper[4894]: I0613 04:53:18.532107 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:18 crc kubenswrapper[4894]: I0613 04:53:18.608518 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:18 crc kubenswrapper[4894]: I0613 04:53:18.763844 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b5kgb" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.333100 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.333172 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.393009 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.772364 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgrkm" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.928918 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.928976 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:20 crc kubenswrapper[4894]: I0613 04:53:20.972082 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:21 crc kubenswrapper[4894]: I0613 04:53:21.795060 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xltkn" Jun 13 04:53:22 crc kubenswrapper[4894]: I0613 04:53:22.772262 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:22 crc kubenswrapper[4894]: I0613 04:53:22.773805 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:22 crc kubenswrapper[4894]: I0613 04:53:22.825373 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:23 crc kubenswrapper[4894]: I0613 04:53:23.811371 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zf66l" Jun 13 04:53:26 crc kubenswrapper[4894]: I0613 04:53:26.236371 4894 patch_prober.go:28] interesting 
pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:53:26 crc kubenswrapper[4894]: I0613 04:53:26.236698 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:53:56 crc kubenswrapper[4894]: I0613 04:53:56.236448 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:53:56 crc kubenswrapper[4894]: I0613 04:53:56.237826 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:54:26 crc kubenswrapper[4894]: I0613 04:54:26.236580 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:54:26 crc kubenswrapper[4894]: I0613 04:54:26.237299 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:54:26 crc kubenswrapper[4894]: I0613 04:54:26.237392 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:54:26 crc kubenswrapper[4894]: I0613 04:54:26.238319 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 04:54:26 crc kubenswrapper[4894]: I0613 04:54:26.238448 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01" gracePeriod=600 Jun 13 04:54:27 crc kubenswrapper[4894]: I0613 04:54:27.172893 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01" exitCode=0 Jun 13 04:54:27 crc kubenswrapper[4894]: I0613 04:54:27.173004 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01"} Jun 13 04:54:27 crc kubenswrapper[4894]: I0613 04:54:27.173486 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9"} Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.060867 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrpmt"] Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.062193 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.086032 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrpmt"] Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201312 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1038ae2-09b2-4cae-8007-0cac17959b66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201374 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-bound-sa-token\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201428 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-trusted-ca\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201463 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-tls\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201491 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzzz\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-kube-api-access-mwzzz\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201672 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-certificates\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201928 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1038ae2-09b2-4cae-8007-0cac17959b66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.201995 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.227144 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303391 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1038ae2-09b2-4cae-8007-0cac17959b66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303434 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-bound-sa-token\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303485 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-trusted-ca\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303511 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-tls\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303528 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzzz\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-kube-api-access-mwzzz\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: 
\"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303545 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-certificates\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.303569 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1038ae2-09b2-4cae-8007-0cac17959b66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.304014 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1038ae2-09b2-4cae-8007-0cac17959b66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.306164 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-trusted-ca\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.306334 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-certificates\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.311427 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-registry-tls\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.312568 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1038ae2-09b2-4cae-8007-0cac17959b66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.323867 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzzz\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-kube-api-access-mwzzz\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.332411 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1038ae2-09b2-4cae-8007-0cac17959b66-bound-sa-token\") pod \"image-registry-66df7c8f76-wrpmt\" (UID: \"e1038ae2-09b2-4cae-8007-0cac17959b66\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.381287 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.604074 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrpmt"] Jun 13 04:55:32 crc kubenswrapper[4894]: I0613 04:55:32.623802 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" event={"ID":"e1038ae2-09b2-4cae-8007-0cac17959b66","Type":"ContainerStarted","Data":"4200369537c256ccebeead90a92379f0701c55749fb8e4de09bcc233fc5922fc"} Jun 13 04:55:33 crc kubenswrapper[4894]: I0613 04:55:33.634969 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" event={"ID":"e1038ae2-09b2-4cae-8007-0cac17959b66","Type":"ContainerStarted","Data":"d3ecbeb90cf00c909583877cfa8e70a119e52563773792d61a5d4d42b3b90123"} Jun 13 04:55:33 crc kubenswrapper[4894]: I0613 04:55:33.636878 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:52 crc kubenswrapper[4894]: I0613 04:55:52.392076 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" Jun 13 04:55:52 crc kubenswrapper[4894]: I0613 04:55:52.466912 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wrpmt" podStartSLOduration=20.466888538 podStartE2EDuration="20.466888538s" podCreationTimestamp="2025-06-13 04:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:55:33.671466342 +0000 UTC m=+292.117713845" watchObservedRunningTime="2025-06-13 04:55:52.466888538 +0000 UTC m=+310.913136031" Jun 13 04:55:52 crc kubenswrapper[4894]: I0613 04:55:52.494943 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.552076 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" podUID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" containerName="registry" containerID="cri-o://9586a0b1f3d3ab149f69997076b6fa5eff18460f30c2d608923e600903e396b9" gracePeriod=30 Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.962547 4894 generic.go:334] "Generic (PLEG): container finished" podID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" containerID="9586a0b1f3d3ab149f69997076b6fa5eff18460f30c2d608923e600903e396b9" exitCode=0 Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.962628 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" event={"ID":"e7074322-a56f-4380-bf71-2ae9d44e9bc8","Type":"ContainerDied","Data":"9586a0b1f3d3ab149f69997076b6fa5eff18460f30c2d608923e600903e396b9"} Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.963634 4894 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" event={"ID":"e7074322-a56f-4380-bf71-2ae9d44e9bc8","Type":"ContainerDied","Data":"a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1"} Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.963658 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18350401258aa4cda654ee298302f6ae9752d1fa27528bb9ab556825b952fb1" Jun 13 04:56:17 crc kubenswrapper[4894]: I0613 04:56:17.972085 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075316 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075444 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075488 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk5gw\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075773 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075863 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075908 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075961 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.075994 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca\") pod \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\" (UID: \"e7074322-a56f-4380-bf71-2ae9d44e9bc8\") " Jun 13 04:56:18 
crc kubenswrapper[4894]: I0613 04:56:18.077151 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.077420 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.080257 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw" (OuterVolumeSpecName: "kube-api-access-mk5gw") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "kube-api-access-mk5gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.080684 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.080748 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.082191 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.091246 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.097048 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e7074322-a56f-4380-bf71-2ae9d44e9bc8" (UID: "e7074322-a56f-4380-bf71-2ae9d44e9bc8"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177479 4894 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-tls\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177537 4894 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7074322-a56f-4380-bf71-2ae9d44e9bc8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177559 4894 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7074322-a56f-4380-bf71-2ae9d44e9bc8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177582 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-trusted-ca\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177603 4894 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177628 4894 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7074322-a56f-4380-bf71-2ae9d44e9bc8-registry-certificates\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.177661 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk5gw\" (UniqueName: \"kubernetes.io/projected/e7074322-a56f-4380-bf71-2ae9d44e9bc8-kube-api-access-mk5gw\") on node \"crc\" DevicePath \"\"" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.970506 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggqkw" Jun 13 04:56:18 crc kubenswrapper[4894]: I0613 04:56:18.996116 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:56:19 crc kubenswrapper[4894]: I0613 04:56:19.009877 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggqkw"] Jun 13 04:56:20 crc kubenswrapper[4894]: I0613 04:56:20.289262 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" path="/var/lib/kubelet/pods/e7074322-a56f-4380-bf71-2ae9d44e9bc8/volumes" Jun 13 04:56:26 crc kubenswrapper[4894]: I0613 04:56:26.237413 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:56:26 crc kubenswrapper[4894]: I0613 04:56:26.238009 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:56:56 crc kubenswrapper[4894]: I0613 04:56:56.236834 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:56:56 crc kubenswrapper[4894]: I0613 04:56:56.238124 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.236909 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.237715 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.237809 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.238951 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.239079 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9" gracePeriod=600 Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.426096 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9" exitCode=0 Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.426183 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9"} Jun 13 04:57:26 crc kubenswrapper[4894]: I0613 04:57:26.426239 4894 scope.go:117] "RemoveContainer" containerID="1554a208bca33401480b054d9c27ff3058ae8faf183c53aa538b38fb6bd3bf01" Jun 13 04:57:27 crc kubenswrapper[4894]: I0613 04:57:27.437265 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa"} Jun 13 04:57:42 crc kubenswrapper[4894]: I0613 04:57:42.477226 4894 scope.go:117] "RemoveContainer" containerID="9586a0b1f3d3ab149f69997076b6fa5eff18460f30c2d608923e600903e396b9" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.796285 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["default/crc-debug-bvp9k"] Jun 13 04:58:01 crc kubenswrapper[4894]: E0613 04:58:01.797303 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" containerName="registry" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.797325 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" containerName="registry" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.797532 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7074322-a56f-4380-bf71-2ae9d44e9bc8" containerName="registry" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.798145 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.801645 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"default"/"kube-root-ca.crt" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.802740 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"default"/"openshift-service-ca.crt" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.804426 4894 reflector.go:368] Caches populated for *v1.Secret from object-"default"/"default-dockercfg-xqf5w" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.833376 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z4xm\" (UniqueName: \"kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.833553 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.934909 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.935057 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z4xm\" (UniqueName: \"kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.935167 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:01 crc kubenswrapper[4894]: I0613 04:58:01.971124 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z4xm\" (UniqueName: \"kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm\") pod \"crc-debug-bvp9k\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " pod="default/crc-debug-bvp9k" Jun 13 04:58:02 crc kubenswrapper[4894]: I0613 04:58:02.129283 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="default/crc-debug-bvp9k" Jun 13 04:58:02 crc kubenswrapper[4894]: I0613 04:58:02.164367 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 04:58:02 crc kubenswrapper[4894]: I0613 04:58:02.681476 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="default/crc-debug-bvp9k" event={"ID":"89157b56-93d3-4c68-a94e-13dc661b4d77","Type":"ContainerStarted","Data":"c41fdd313885a0e06ec9b9163665c16971fd6f7f5c40c32c2848cc4c84c5e37e"} Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.833357 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wj9fc"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.835000 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.841352 4894 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-fq9jr" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.841472 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.846865 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.852962 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vzsr6"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.853806 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vzsr6" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.858603 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wj9fc"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.861218 4894 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rpxfj" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.874583 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnkm\" (UniqueName: \"kubernetes.io/projected/d817e341-4899-4993-b77d-827f73433f02-kube-api-access-hcnkm\") pod \"cert-manager-cainjector-7f985d654d-wj9fc\" (UID: \"d817e341-4899-4993-b77d-827f73433f02\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.874746 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r72g\" (UniqueName: \"kubernetes.io/projected/e2bb6864-165e-42ff-b86e-19129ead9f47-kube-api-access-7r72g\") pod \"cert-manager-5b446d88c5-vzsr6\" (UID: \"e2bb6864-165e-42ff-b86e-19129ead9f47\") " pod="cert-manager/cert-manager-5b446d88c5-vzsr6" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.878976 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qh9kw"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.879626 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.881239 4894 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mdsvv" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.893483 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vzsr6"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.904311 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qh9kw"] Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.975551 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r72g\" (UniqueName: \"kubernetes.io/projected/e2bb6864-165e-42ff-b86e-19129ead9f47-kube-api-access-7r72g\") pod \"cert-manager-5b446d88c5-vzsr6\" (UID: \"e2bb6864-165e-42ff-b86e-19129ead9f47\") " pod="cert-manager/cert-manager-5b446d88c5-vzsr6" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.975613 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnkm\" (UniqueName: \"kubernetes.io/projected/d817e341-4899-4993-b77d-827f73433f02-kube-api-access-hcnkm\") pod \"cert-manager-cainjector-7f985d654d-wj9fc\" (UID: \"d817e341-4899-4993-b77d-827f73433f02\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" Jun 13 04:58:10 crc kubenswrapper[4894]: I0613 04:58:10.975645 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq65m\" (UniqueName: \"kubernetes.io/projected/8722649b-9056-433e-a648-fb9e76f9e2e1-kube-api-access-vq65m\") pod \"cert-manager-webhook-5655c58dd6-qh9kw\" (UID: \"8722649b-9056-433e-a648-fb9e76f9e2e1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.003824 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r72g\" (UniqueName: \"kubernetes.io/projected/e2bb6864-165e-42ff-b86e-19129ead9f47-kube-api-access-7r72g\") pod \"cert-manager-5b446d88c5-vzsr6\" (UID: \"e2bb6864-165e-42ff-b86e-19129ead9f47\") " pod="cert-manager/cert-manager-5b446d88c5-vzsr6" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.003839 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcnkm\" (UniqueName: \"kubernetes.io/projected/d817e341-4899-4993-b77d-827f73433f02-kube-api-access-hcnkm\") pod \"cert-manager-cainjector-7f985d654d-wj9fc\" (UID: \"d817e341-4899-4993-b77d-827f73433f02\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.076594 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq65m\" (UniqueName: \"kubernetes.io/projected/8722649b-9056-433e-a648-fb9e76f9e2e1-kube-api-access-vq65m\") pod \"cert-manager-webhook-5655c58dd6-qh9kw\" (UID: \"8722649b-9056-433e-a648-fb9e76f9e2e1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.105547 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq65m\" (UniqueName: \"kubernetes.io/projected/8722649b-9056-433e-a648-fb9e76f9e2e1-kube-api-access-vq65m\") pod \"cert-manager-webhook-5655c58dd6-qh9kw\" (UID: \"8722649b-9056-433e-a648-fb9e76f9e2e1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" 
Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.156272 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.168747 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vzsr6" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.203935 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.369893 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vzsr6"] Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.402061 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wj9fc"] Jun 13 04:58:11 crc kubenswrapper[4894]: W0613 04:58:11.414560 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd817e341_4899_4993_b77d_827f73433f02.slice/crio-ffc242d8dabb732813518457738234c72460516a640f5ad38d3a555c93f5e189 WatchSource:0}: Error finding container ffc242d8dabb732813518457738234c72460516a640f5ad38d3a555c93f5e189: Status 404 returned error can't find the container with id ffc242d8dabb732813518457738234c72460516a640f5ad38d3a555c93f5e189 Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.453644 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qh9kw"] Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.746224 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vzsr6" event={"ID":"e2bb6864-165e-42ff-b86e-19129ead9f47","Type":"ContainerStarted","Data":"b439bf42e6c34932616244d7d25187bd507253b207d57353e619c18700e62eb6"} Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.747867 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="default/crc-debug-bvp9k" event={"ID":"89157b56-93d3-4c68-a94e-13dc661b4d77","Type":"ContainerStarted","Data":"e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307"} Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.750244 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" event={"ID":"d817e341-4899-4993-b77d-827f73433f02","Type":"ContainerStarted","Data":"ffc242d8dabb732813518457738234c72460516a640f5ad38d3a555c93f5e189"} Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.750964 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" event={"ID":"8722649b-9056-433e-a648-fb9e76f9e2e1","Type":"ContainerStarted","Data":"e053dc3a3619c762ea47d4d42ebb15ed4ca5167842f91d2da3d2104b70e652aa"} Jun 13 04:58:11 crc kubenswrapper[4894]: I0613 04:58:11.769606 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/crc-debug-bvp9k" podStartSLOduration=2.029303948 podStartE2EDuration="10.769577471s" podCreationTimestamp="2025-06-13 04:58:01 +0000 UTC" firstStartedPulling="2025-06-13 04:58:02.163988306 +0000 UTC m=+440.610235809" lastFinishedPulling="2025-06-13 04:58:10.904261869 +0000 UTC m=+449.350509332" observedRunningTime="2025-06-13 04:58:11.768882232 +0000 UTC m=+450.215129735" watchObservedRunningTime="2025-06-13 04:58:11.769577471 +0000 UTC m=+450.215825024" Jun 13 04:58:16 crc 
kubenswrapper[4894]: I0613 04:58:16.787292 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" event={"ID":"8722649b-9056-433e-a648-fb9e76f9e2e1","Type":"ContainerStarted","Data":"93c3b045041fd1ffce0e3e1348db1e30796cc5fd25c98c72edea5ebb0aea9965"} Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.788966 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.792880 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vzsr6" event={"ID":"e2bb6864-165e-42ff-b86e-19129ead9f47","Type":"ContainerStarted","Data":"9fcdb2a457552ddabf56d0a241053acf71cc8c6045148f7f2bb29c7b7c0e3593"} Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.795574 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" event={"ID":"d817e341-4899-4993-b77d-827f73433f02","Type":"ContainerStarted","Data":"4133560eac2e2ea61fab47756f6a10b49e0a2aa43f2f1da63425512a0114401a"} Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.816791 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" podStartSLOduration=2.37080511 podStartE2EDuration="6.816759284s" podCreationTimestamp="2025-06-13 04:58:10 +0000 UTC" firstStartedPulling="2025-06-13 04:58:11.467730917 +0000 UTC m=+449.913978370" lastFinishedPulling="2025-06-13 04:58:15.913685081 +0000 UTC m=+454.359932544" observedRunningTime="2025-06-13 04:58:16.809557104 +0000 UTC m=+455.255804567" watchObservedRunningTime="2025-06-13 04:58:16.816759284 +0000 UTC m=+455.263006777" Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.839003 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wj9fc" podStartSLOduration=2.33985795 podStartE2EDuration="6.838975513s" podCreationTimestamp="2025-06-13 04:58:10 +0000 UTC" firstStartedPulling="2025-06-13 04:58:11.417020966 +0000 UTC m=+449.863268419" lastFinishedPulling="2025-06-13 04:58:15.916138519 +0000 UTC m=+454.362385982" observedRunningTime="2025-06-13 04:58:16.834677323 +0000 UTC m=+455.280924796" watchObservedRunningTime="2025-06-13 04:58:16.838975513 +0000 UTC m=+455.285223006" Jun 13 04:58:16 crc kubenswrapper[4894]: I0613 04:58:16.860153 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vzsr6" podStartSLOduration=2.340877057 podStartE2EDuration="6.860123211s" podCreationTimestamp="2025-06-13 04:58:10 +0000 UTC" firstStartedPulling="2025-06-13 04:58:11.387475643 +0000 UTC m=+449.833723106" lastFinishedPulling="2025-06-13 04:58:15.906721767 +0000 UTC m=+454.352969260" observedRunningTime="2025-06-13 04:58:16.8568331 +0000 UTC m=+455.303080603" watchObservedRunningTime="2025-06-13 04:58:16.860123211 +0000 UTC m=+455.306370714" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.203805 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["default/crc-debug-bvp9k"] Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.204521 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="default/crc-debug-bvp9k" podUID="89157b56-93d3-4c68-a94e-13dc661b4d77" containerName="container-00" containerID="cri-o://e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307" gracePeriod=2 Jun 13 04:58:21 crc 
kubenswrapper[4894]: I0613 04:58:21.208378 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-qh9kw" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.208514 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["default/crc-debug-bvp9k"] Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.268360 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="default/crc-debug-bvp9k" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.371129 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z4xm\" (UniqueName: \"kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm\") pod \"89157b56-93d3-4c68-a94e-13dc661b4d77\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.371726 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host\") pod \"89157b56-93d3-4c68-a94e-13dc661b4d77\" (UID: \"89157b56-93d3-4c68-a94e-13dc661b4d77\") " Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.372063 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host" (OuterVolumeSpecName: "host") pod "89157b56-93d3-4c68-a94e-13dc661b4d77" (UID: "89157b56-93d3-4c68-a94e-13dc661b4d77"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.372480 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89157b56-93d3-4c68-a94e-13dc661b4d77-host\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.379022 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm" (OuterVolumeSpecName: "kube-api-access-5z4xm") pod "89157b56-93d3-4c68-a94e-13dc661b4d77" (UID: "89157b56-93d3-4c68-a94e-13dc661b4d77"). InnerVolumeSpecName "kube-api-access-5z4xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.473572 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z4xm\" (UniqueName: \"kubernetes.io/projected/89157b56-93d3-4c68-a94e-13dc661b4d77-kube-api-access-5z4xm\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.827902 4894 generic.go:334] "Generic (PLEG): container finished" podID="89157b56-93d3-4c68-a94e-13dc661b4d77" containerID="e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307" exitCode=0 Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.827956 4894 scope.go:117] "RemoveContainer" containerID="e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.828015 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="default/crc-debug-bvp9k" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.853913 4894 scope.go:117] "RemoveContainer" containerID="e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307" Jun 13 04:58:21 crc kubenswrapper[4894]: E0613 04:58:21.859235 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307\": container with ID starting with e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307 not found: ID does not exist" containerID="e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307" Jun 13 04:58:21 crc kubenswrapper[4894]: I0613 04:58:21.859319 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307"} err="failed to get container status \"e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307\": rpc error: code = NotFound desc = could not find container \"e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307\": container with ID starting with e2a524515e823c5171b626375fa4fe2ccf26ab3d723ca0fd44ec79c2937de307 not found: ID does not exist" Jun 13 04:58:22 crc kubenswrapper[4894]: I0613 04:58:22.288318 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89157b56-93d3-4c68-a94e-13dc661b4d77" path="/var/lib/kubelet/pods/89157b56-93d3-4c68-a94e-13dc661b4d77/volumes" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.061816 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8ss9"] Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.064915 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-controller" containerID="cri-o://3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065137 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="sbdb" containerID="cri-o://e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065011 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-node" containerID="cri-o://338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065046 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065098 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="nbdb" containerID="cri-o://990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" gracePeriod=30 Jun 13 
04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065322 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="northd" containerID="cri-o://3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.065068 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-acl-logging" containerID="cri-o://e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.130577 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovnkube-controller" containerID="cri-o://850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" gracePeriod=30 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.793623 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8ss9_74d44566-8b68-4321-aec1-c8f73ead6c7c/ovn-acl-logging/0.log" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.794496 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8ss9_74d44566-8b68-4321-aec1-c8f73ead6c7c/ovn-controller/0.log" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.795254 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870234 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6xlv6"] Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870579 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870607 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870624 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-ovn-metrics" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870637 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-ovn-metrics" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870696 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89157b56-93d3-4c68-a94e-13dc661b4d77" containerName="container-00" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870710 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="89157b56-93d3-4c68-a94e-13dc661b4d77" containerName="container-00" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870729 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="nbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870743 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="nbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 
04:58:44.870758 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kubecfg-setup" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870771 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kubecfg-setup" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870784 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-acl-logging" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870797 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-acl-logging" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870818 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="northd" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870830 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="northd" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870849 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovnkube-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870862 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovnkube-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870883 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-node" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870895 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-node" Jun 13 04:58:44 crc kubenswrapper[4894]: E0613 04:58:44.870910 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="sbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.870922 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="sbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871095 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovnkube-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871113 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-acl-logging" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871128 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="ovn-controller" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871145 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-node" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871163 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="nbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871179 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="sbdb" Jun 13 04:58:44 crc kubenswrapper[4894]: 
I0613 04:58:44.871195 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="89157b56-93d3-4c68-a94e-13dc661b4d77" containerName="container-00" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871210 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="northd" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.871227 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerName="kube-rbac-proxy-ovn-metrics" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.874277 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952307 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952388 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952439 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8m4h\" (UniqueName: \"kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952480 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952522 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952530 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket" (OuterVolumeSpecName: "log-socket") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952579 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952592 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952609 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952691 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952691 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952752 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952782 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952807 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952749 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash" (OuterVolumeSpecName: "host-slash") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952855 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952880 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952834 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952915 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952902 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log" (OuterVolumeSpecName: "node-log") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952942 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952979 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.952984 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953224 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953291 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953364 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953381 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953407 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953412 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953420 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953438 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953456 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953470 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953513 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib\") pod \"74d44566-8b68-4321-aec1-c8f73ead6c7c\" (UID: \"74d44566-8b68-4321-aec1-c8f73ead6c7c\") " Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953473 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953506 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953683 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-var-lib-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953720 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-ovn\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953764 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953813 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-env-overrides\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953871 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-config\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953950 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-bin\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.953987 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-etc-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954021 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzc5t\" (UniqueName: \"kubernetes.io/projected/80c629f7-4758-42b5-b413-41db11832b8d-kube-api-access-vzc5t\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954054 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954092 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-node-log\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954133 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-netd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954148 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954345 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-slash\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954476 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-script-lib\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954553 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954627 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-kubelet\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954725 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-systemd-units\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954811 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-log-socket\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954876 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-systemd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954927 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-netns\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.954982 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80c629f7-4758-42b5-b413-41db11832b8d-ovn-node-metrics-cert\") pod 
\"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955324 4894 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955354 4894 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955373 4894 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955388 4894 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-ovn\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955401 4894 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955412 4894 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955425 4894 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-log-socket\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955440 4894 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955453 4894 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-kubelet\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955467 4894 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955481 4894 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-slash\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955491 4894 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74d44566-8b68-4321-aec1-c8f73ead6c7c-env-overrides\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955503 4894 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-node-log\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955514 4894 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955524 4894 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-host-run-netns\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955535 4894 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-systemd-units\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.955546 4894 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.960137 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.960193 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h" (OuterVolumeSpecName: "kube-api-access-x8m4h") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "kube-api-access-x8m4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.969164 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "74d44566-8b68-4321-aec1-c8f73ead6c7c" (UID: "74d44566-8b68-4321-aec1-c8f73ead6c7c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.986065 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8ss9_74d44566-8b68-4321-aec1-c8f73ead6c7c/ovn-acl-logging/0.log" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.986695 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8ss9_74d44566-8b68-4321-aec1-c8f73ead6c7c/ovn-controller/0.log" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987179 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987213 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987224 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987234 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987243 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987253 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" exitCode=0 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987261 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" exitCode=143 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987252 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987316 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987333 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987348 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" 
event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987364 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987271 4894 generic.go:334] "Generic (PLEG): container finished" podID="74d44566-8b68-4321-aec1-c8f73ead6c7c" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" exitCode=143 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987380 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987394 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987408 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987417 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987428 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987440 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987449 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987457 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987465 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987472 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987479 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987486 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987493 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987499 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987509 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987519 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987527 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987535 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987543 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987550 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987558 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987567 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987576 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987587 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987599 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" event={"ID":"74d44566-8b68-4321-aec1-c8f73ead6c7c","Type":"ContainerDied","Data":"e63133071753a8d8f4423b52ad1092600ebf49c45da1cd198013a8f9d5d1eada"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987456 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987613 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987679 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987689 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987701 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987709 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987717 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987726 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987734 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.987742 4894 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.989990 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnlj9_b06b223d-8b15-48b3-ab96-5cf1b76fbcbd/kube-multus/0.log" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.990031 4894 generic.go:334] "Generic (PLEG): container finished" podID="b06b223d-8b15-48b3-ab96-5cf1b76fbcbd" containerID="566e4df392011bdd259e91c4add736fd3ea48b8ea29ced0871cd3f8bd459d669" exitCode=2 Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.990059 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnlj9" event={"ID":"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd","Type":"ContainerDied","Data":"566e4df392011bdd259e91c4add736fd3ea48b8ea29ced0871cd3f8bd459d669"} Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.990629 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8ss9" Jun 13 04:58:44 crc kubenswrapper[4894]: I0613 04:58:44.990707 4894 scope.go:117] "RemoveContainer" containerID="566e4df392011bdd259e91c4add736fd3ea48b8ea29ced0871cd3f8bd459d669" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.027026 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.061621 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-systemd-units\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062045 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-log-socket\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062078 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-systemd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062121 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-netns\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062149 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80c629f7-4758-42b5-b413-41db11832b8d-ovn-node-metrics-cert\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062171 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-var-lib-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062213 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-ovn\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062238 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 
04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062282 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-env-overrides\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062307 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-config\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062327 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-bin\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062369 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-etc-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062391 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzc5t\" (UniqueName: \"kubernetes.io/projected/80c629f7-4758-42b5-b413-41db11832b8d-kube-api-access-vzc5t\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062413 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.062484 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-systemd-units\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063410 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-ovn\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063544 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-log-socket\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063568 4894 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-systemd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063585 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-netns\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063682 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8ss9"] Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063770 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-etc-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.063814 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-var-lib-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.064094 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.064731 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-config\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065072 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-bin\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065196 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065303 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-env-overrides\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 
04:58:45.065410 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-node-log\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065479 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-node-log\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065599 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-netd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065778 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-slash\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065952 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066067 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-script-lib\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066222 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-kubelet\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066386 4894 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74d44566-8b68-4321-aec1-c8f73ead6c7c-run-systemd\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066461 4894 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74d44566-8b68-4321-aec1-c8f73ead6c7c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066533 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8m4h\" (UniqueName: \"kubernetes.io/projected/74d44566-8b68-4321-aec1-c8f73ead6c7c-kube-api-access-x8m4h\") on node \"crc\" DevicePath \"\"" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066637 4894 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-kubelet\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065769 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-cni-netd\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.066879 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-run-openvswitch\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.065812 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80c629f7-4758-42b5-b413-41db11832b8d-host-slash\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.076166 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.077127 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80c629f7-4758-42b5-b413-41db11832b8d-ovnkube-script-lib\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.079298 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80c629f7-4758-42b5-b413-41db11832b8d-ovn-node-metrics-cert\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.086355 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8ss9"] Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.095630 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzc5t\" (UniqueName: \"kubernetes.io/projected/80c629f7-4758-42b5-b413-41db11832b8d-kube-api-access-vzc5t\") pod \"ovnkube-node-6xlv6\" (UID: \"80c629f7-4758-42b5-b413-41db11832b8d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.095671 4894 scope.go:117] "RemoveContainer" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.113659 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.128015 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.141604 4894 scope.go:117] "RemoveContainer" 
containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.158599 4894 scope.go:117] "RemoveContainer" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.173131 4894 scope.go:117] "RemoveContainer" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.198606 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.199267 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.199324 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} err="failed to get container status \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.199350 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.199789 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.199835 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} err="failed to get container status \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.199850 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.200600 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.204117 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.204323 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} err="failed to get container status \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": rpc error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.204344 4894 scope.go:117] "RemoveContainer" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.205606 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.205644 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} err="failed to get container status \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.205694 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.206173 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.206218 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} err="failed to get container status \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 
9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.206252 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.206574 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.206601 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} err="failed to get container status \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.206619 4894 scope.go:117] "RemoveContainer" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.208314 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": container with ID starting with e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c not found: ID does not exist" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.208344 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} err="failed to get container status \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": rpc error: code = NotFound desc = could not find container \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": container with ID starting with e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.208366 4894 scope.go:117] "RemoveContainer" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.210048 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": container with ID starting with 3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226 not found: ID does not exist" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.210076 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} err="failed to get container status \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": rpc 
error: code = NotFound desc = could not find container \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": container with ID starting with 3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.210096 4894 scope.go:117] "RemoveContainer" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: E0613 04:58:45.211123 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": container with ID starting with 35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991 not found: ID does not exist" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.211179 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} err="failed to get container status \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": rpc error: code = NotFound desc = could not find container \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": container with ID starting with 35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.211200 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.212143 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} err="failed to get container status \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.212209 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.213455 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} err="failed to get container status \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.213487 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.214514 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} err="failed to get container status \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": rpc 
error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.214543 4894 scope.go:117] "RemoveContainer" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.214822 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} err="failed to get container status \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.214862 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215081 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} err="failed to get container status \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215096 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215392 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} err="failed to get container status \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215413 4894 scope.go:117] "RemoveContainer" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215627 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} err="failed to get container status \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": rpc error: code = NotFound desc = could not find container \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": container with ID starting with e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215679 4894 scope.go:117] "RemoveContainer" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc 
kubenswrapper[4894]: I0613 04:58:45.215952 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} err="failed to get container status \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": rpc error: code = NotFound desc = could not find container \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": container with ID starting with 3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.215969 4894 scope.go:117] "RemoveContainer" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.216328 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} err="failed to get container status \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": rpc error: code = NotFound desc = could not find container \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": container with ID starting with 35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.216349 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.216727 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} err="failed to get container status \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.216753 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.216988 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} err="failed to get container status \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.217005 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.217242 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} err="failed to get container status \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": rpc error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID 
starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.217265 4894 scope.go:117] "RemoveContainer" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.217530 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} err="failed to get container status \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.217568 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.218792 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} err="failed to get container status \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.218817 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.219190 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} err="failed to get container status \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.219212 4894 scope.go:117] "RemoveContainer" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.219555 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} err="failed to get container status \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": rpc error: code = NotFound desc = could not find container \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": container with ID starting with e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.219780 4894 scope.go:117] "RemoveContainer" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220168 4894 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} err="failed to get container status \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": rpc error: code = NotFound desc = could not find container \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": container with ID starting with 3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220205 4894 scope.go:117] "RemoveContainer" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220500 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} err="failed to get container status \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": rpc error: code = NotFound desc = could not find container \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": container with ID starting with 35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220520 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220901 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} err="failed to get container status \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.220922 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221162 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} err="failed to get container status \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221186 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221431 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} err="failed to get container status \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": rpc error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" Jun 
13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221459 4894 scope.go:117] "RemoveContainer" containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221715 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} err="failed to get container status \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.221737 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222053 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} err="failed to get container status \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222073 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222322 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} err="failed to get container status \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222345 4894 scope.go:117] "RemoveContainer" containerID="e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222621 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c"} err="failed to get container status \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": rpc error: code = NotFound desc = could not find container \"e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c\": container with ID starting with e03c68223cf4ebf96fab7632cf813989494a6bcf502d7a6ff34acb2ede134b9c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.222890 4894 scope.go:117] "RemoveContainer" containerID="3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.223231 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226"} err="failed to get container status 
\"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": rpc error: code = NotFound desc = could not find container \"3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226\": container with ID starting with 3db346036cbfbf9983f03aca5e5ee6c330a16af055af76087c9485d34feb2226 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.223255 4894 scope.go:117] "RemoveContainer" containerID="35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.223946 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991"} err="failed to get container status \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": rpc error: code = NotFound desc = could not find container \"35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991\": container with ID starting with 35ae15be9d8acf353c1abaabb78870b3a18bc79496c8135ebb6ce21a4c8f6991 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.223976 4894 scope.go:117] "RemoveContainer" containerID="850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.227249 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294"} err="failed to get container status \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": rpc error: code = NotFound desc = could not find container \"850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294\": container with ID starting with 850be338780fa15e940cfa2b4a3e4c41938a651dde9cfe11f1246bb9ad64c294 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.227276 4894 scope.go:117] "RemoveContainer" containerID="e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.227700 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14"} err="failed to get container status \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": rpc error: code = NotFound desc = could not find container \"e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14\": container with ID starting with e3b1f39bda1e66c8eb1b74e6d9f5042eb70e4f977f14d11722f0998e4d584a14 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.227724 4894 scope.go:117] "RemoveContainer" containerID="990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228041 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34"} err="failed to get container status \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": rpc error: code = NotFound desc = could not find container \"990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34\": container with ID starting with 990e1300d449f997996b2f703923bf074094b60b21c268348aadfe92aeb44a34 not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228065 4894 scope.go:117] "RemoveContainer" 
containerID="3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228466 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe"} err="failed to get container status \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": rpc error: code = NotFound desc = could not find container \"3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe\": container with ID starting with 3826667cf56127359bd3942190562b9e585a956b4f5e13fa9b69133a8c910afe not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228494 4894 scope.go:117] "RemoveContainer" containerID="9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228954 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c"} err="failed to get container status \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": rpc error: code = NotFound desc = could not find container \"9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c\": container with ID starting with 9eda102c001016a47a3bfe5564b4acf4e8cfb57a021926397326183cc499cc2c not found: ID does not exist" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.228984 4894 scope.go:117] "RemoveContainer" containerID="338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e" Jun 13 04:58:45 crc kubenswrapper[4894]: I0613 04:58:45.231716 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e"} err="failed to get container status \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": rpc error: code = NotFound desc = could not find container \"338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e\": container with ID starting with 338a3446bed91d2298ab431e0455867a8729b065dddaf15c0b3656d1effc978e not found: ID does not exist" Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.000142 4894 generic.go:334] "Generic (PLEG): container finished" podID="80c629f7-4758-42b5-b413-41db11832b8d" containerID="12d9e20a24397dc5e87f15d83482c61e93bdcbaf506c832d64d371fa61a8d9c6" exitCode=0 Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.000248 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerDied","Data":"12d9e20a24397dc5e87f15d83482c61e93bdcbaf506c832d64d371fa61a8d9c6"} Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.000629 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"02fe661b1c39cc846dcf893131f9614a3dd693daed51e6badf2650982d1c92f0"} Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.005989 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnlj9_b06b223d-8b15-48b3-ab96-5cf1b76fbcbd/kube-multus/0.log" Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.006080 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnlj9" 
event={"ID":"b06b223d-8b15-48b3-ab96-5cf1b76fbcbd","Type":"ContainerStarted","Data":"a1473271550acb90aa3eebd1f8fd2271e3f201e81ebdad7d27d62c8198b8fbe4"} Jun 13 04:58:46 crc kubenswrapper[4894]: I0613 04:58:46.287585 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d44566-8b68-4321-aec1-c8f73ead6c7c" path="/var/lib/kubelet/pods/74d44566-8b68-4321-aec1-c8f73ead6c7c/volumes" Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018011 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"3c684b67cdf0bba7c85536e7f69f0614be0409de450dc8324ad05935d42aff41"} Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018322 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"a99d01f302d62d39301ebccc70c19a47e3bd6e7d4cbf9dafb12445e3f502d11a"} Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018336 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"889fe671a461b5187b34275c3b1732fbc49c1f318e2fb46980bc5f9b853432dd"} Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018350 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"20e2e9eb8fa32779e52fc54c730675c773d81a2acc03d381cd6b82b36e751fdf"} Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018362 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"6c9662d2efcf8b1d082e2a05f94cb6269f21aa230b42dbc34a2e55fa0a0f5491"} Jun 13 04:58:47 crc kubenswrapper[4894]: I0613 04:58:47.018375 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"cc301a77bb2b2e59e643e011b1e0c1d583a4c963e11f7b3b593deee5fd6ec769"} Jun 13 04:58:50 crc kubenswrapper[4894]: I0613 04:58:50.042214 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"9414bf238f3a8ae16d7793ea80a453c4a946f82be3bf4f2a9229d2c5f59bf7f6"} Jun 13 04:58:52 crc kubenswrapper[4894]: I0613 04:58:52.116212 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" event={"ID":"80c629f7-4758-42b5-b413-41db11832b8d","Type":"ContainerStarted","Data":"17d6cddd5eb9df82804ff44731d366f4d6984f8a7e2eadcf00e6d744c1df43fe"} Jun 13 04:58:52 crc kubenswrapper[4894]: I0613 04:58:52.116734 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:52 crc kubenswrapper[4894]: I0613 04:58:52.153261 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:52 crc kubenswrapper[4894]: I0613 04:58:52.161213 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" podStartSLOduration=8.161194512 
podStartE2EDuration="8.161194512s" podCreationTimestamp="2025-06-13 04:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:58:52.158752194 +0000 UTC m=+490.604999657" watchObservedRunningTime="2025-06-13 04:58:52.161194512 +0000 UTC m=+490.607441975" Jun 13 04:58:53 crc kubenswrapper[4894]: I0613 04:58:53.125002 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:53 crc kubenswrapper[4894]: I0613 04:58:53.125044 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:58:53 crc kubenswrapper[4894]: I0613 04:58:53.164091 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.609474 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-mslxt"] Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.611080 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.614114 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.614201 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.619530 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.711615 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.711786 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdb4k\" (UniqueName: \"kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.813017 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdb4k\" (UniqueName: \"kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.813357 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.813545 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 
04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.846026 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdb4k\" (UniqueName: \"kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k\") pod \"crc-debug-mslxt\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: I0613 04:59:01.931947 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mslxt" Jun 13 04:59:01 crc kubenswrapper[4894]: W0613 04:59:01.994010 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98bef62e_0876_488e_8009_2142cfcf0deb.slice/crio-6fe7d6f48f7fea63678bbd3160e2c08f6f55178c883e90b4ed806b1200bce981 WatchSource:0}: Error finding container 6fe7d6f48f7fea63678bbd3160e2c08f6f55178c883e90b4ed806b1200bce981: Status 404 returned error can't find the container with id 6fe7d6f48f7fea63678bbd3160e2c08f6f55178c883e90b4ed806b1200bce981 Jun 13 04:59:02 crc kubenswrapper[4894]: I0613 04:59:02.178572 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mslxt" event={"ID":"98bef62e-0876-488e-8009-2142cfcf0deb","Type":"ContainerStarted","Data":"19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956"} Jun 13 04:59:02 crc kubenswrapper[4894]: I0613 04:59:02.178799 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mslxt" event={"ID":"98bef62e-0876-488e-8009-2142cfcf0deb","Type":"ContainerStarted","Data":"6fe7d6f48f7fea63678bbd3160e2c08f6f55178c883e90b4ed806b1200bce981"} Jun 13 04:59:02 crc kubenswrapper[4894]: I0613 04:59:02.194700 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-mslxt" podStartSLOduration=1.194682821 podStartE2EDuration="1.194682821s" podCreationTimestamp="2025-06-13 04:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:59:02.192993623 +0000 UTC m=+500.639241086" watchObservedRunningTime="2025-06-13 04:59:02.194682821 +0000 UTC m=+500.640930284" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.099646 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x"] Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.103041 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.107447 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.110951 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x"] Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.229360 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgk6\" (UniqueName: \"kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.229429 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.229516 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.332296 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgk6\" (UniqueName: \"kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.332877 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.333120 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.333425 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.333834 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.355069 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgk6\" (UniqueName: \"kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6\") pod \"cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.455475 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:03 crc kubenswrapper[4894]: I0613 04:59:03.957545 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x"] Jun 13 04:59:03 crc kubenswrapper[4894]: W0613 04:59:03.971900 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b945725_0767_451f_9574_d95782ced9c9.slice/crio-6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c WatchSource:0}: Error finding container 6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c: Status 404 returned error can't find the container with id 6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c Jun 13 04:59:04 crc kubenswrapper[4894]: I0613 04:59:04.193924 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerStarted","Data":"3822de3990cb294b714743b397a73c837bf31bbb67be909ae223f4f3a9e52013"} Jun 13 04:59:04 crc kubenswrapper[4894]: I0613 04:59:04.194395 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerStarted","Data":"6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c"} Jun 13 04:59:05 crc kubenswrapper[4894]: I0613 04:59:05.202596 4894 generic.go:334] "Generic (PLEG): container finished" podID="4b945725-0767-451f-9574-d95782ced9c9" containerID="3822de3990cb294b714743b397a73c837bf31bbb67be909ae223f4f3a9e52013" exitCode=0 Jun 13 04:59:05 crc kubenswrapper[4894]: I0613 04:59:05.202687 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerDied","Data":"3822de3990cb294b714743b397a73c837bf31bbb67be909ae223f4f3a9e52013"} Jun 13 04:59:07 crc 
kubenswrapper[4894]: I0613 04:59:07.221192 4894 generic.go:334] "Generic (PLEG): container finished" podID="4b945725-0767-451f-9574-d95782ced9c9" containerID="4080cb629b3e348d7f35868a08740a697f9179afa103ee2b8ac045b0f875e5a5" exitCode=0 Jun 13 04:59:07 crc kubenswrapper[4894]: I0613 04:59:07.221623 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerDied","Data":"4080cb629b3e348d7f35868a08740a697f9179afa103ee2b8ac045b0f875e5a5"} Jun 13 04:59:08 crc kubenswrapper[4894]: I0613 04:59:08.232094 4894 generic.go:334] "Generic (PLEG): container finished" podID="4b945725-0767-451f-9574-d95782ced9c9" containerID="9531257cbeece404c4b9a2f1282640d1fe3dbe94540e0307199fb0282cdfbb19" exitCode=0 Jun 13 04:59:08 crc kubenswrapper[4894]: I0613 04:59:08.232183 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerDied","Data":"9531257cbeece404c4b9a2f1282640d1fe3dbe94540e0307199fb0282cdfbb19"} Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.552263 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.625317 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util\") pod \"4b945725-0767-451f-9574-d95782ced9c9\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.625416 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwgk6\" (UniqueName: \"kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6\") pod \"4b945725-0767-451f-9574-d95782ced9c9\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.625493 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle\") pod \"4b945725-0767-451f-9574-d95782ced9c9\" (UID: \"4b945725-0767-451f-9574-d95782ced9c9\") " Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.626638 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle" (OuterVolumeSpecName: "bundle") pod "4b945725-0767-451f-9574-d95782ced9c9" (UID: "4b945725-0767-451f-9574-d95782ced9c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.633163 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6" (OuterVolumeSpecName: "kube-api-access-zwgk6") pod "4b945725-0767-451f-9574-d95782ced9c9" (UID: "4b945725-0767-451f-9574-d95782ced9c9"). InnerVolumeSpecName "kube-api-access-zwgk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.642366 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util" (OuterVolumeSpecName: "util") pod "4b945725-0767-451f-9574-d95782ced9c9" (UID: "4b945725-0767-451f-9574-d95782ced9c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.726477 4894 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.726533 4894 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b945725-0767-451f-9574-d95782ced9c9-util\") on node \"crc\" DevicePath \"\"" Jun 13 04:59:09 crc kubenswrapper[4894]: I0613 04:59:09.726545 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwgk6\" (UniqueName: \"kubernetes.io/projected/4b945725-0767-451f-9574-d95782ced9c9-kube-api-access-zwgk6\") on node \"crc\" DevicePath \"\"" Jun 13 04:59:10 crc kubenswrapper[4894]: I0613 04:59:10.256495 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" event={"ID":"4b945725-0767-451f-9574-d95782ced9c9","Type":"ContainerDied","Data":"6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c"} Jun 13 04:59:10 crc kubenswrapper[4894]: I0613 04:59:10.256549 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6caacda7efb7e8a029e9c425cfb9b80704b6c0327809ea82ae24c6bbbf94894c" Jun 13 04:59:10 crc kubenswrapper[4894]: I0613 04:59:10.256642 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x" Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.418794 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-mslxt"] Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.419174 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-mslxt" podUID="98bef62e-0876-488e-8009-2142cfcf0deb" containerName="container-00" containerID="cri-o://19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956" gracePeriod=2 Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.423975 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-mslxt"] Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.487942 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mslxt" Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.672014 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdb4k\" (UniqueName: \"kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k\") pod \"98bef62e-0876-488e-8009-2142cfcf0deb\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.672447 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host\") pod \"98bef62e-0876-488e-8009-2142cfcf0deb\" (UID: \"98bef62e-0876-488e-8009-2142cfcf0deb\") " Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.672526 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host" (OuterVolumeSpecName: "host") pod "98bef62e-0876-488e-8009-2142cfcf0deb" (UID: "98bef62e-0876-488e-8009-2142cfcf0deb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.673036 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98bef62e-0876-488e-8009-2142cfcf0deb-host\") on node \"crc\" DevicePath \"\"" Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.679719 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k" (OuterVolumeSpecName: "kube-api-access-wdb4k") pod "98bef62e-0876-488e-8009-2142cfcf0deb" (UID: "98bef62e-0876-488e-8009-2142cfcf0deb"). InnerVolumeSpecName "kube-api-access-wdb4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 04:59:12 crc kubenswrapper[4894]: I0613 04:59:12.774005 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdb4k\" (UniqueName: \"kubernetes.io/projected/98bef62e-0876-488e-8009-2142cfcf0deb-kube-api-access-wdb4k\") on node \"crc\" DevicePath \"\"" Jun 13 04:59:13 crc kubenswrapper[4894]: I0613 04:59:13.279730 4894 generic.go:334] "Generic (PLEG): container finished" podID="98bef62e-0876-488e-8009-2142cfcf0deb" containerID="19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956" exitCode=0 Jun 13 04:59:13 crc kubenswrapper[4894]: I0613 04:59:13.279888 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mslxt" Jun 13 04:59:13 crc kubenswrapper[4894]: I0613 04:59:13.279915 4894 scope.go:117] "RemoveContainer" containerID="19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956" Jun 13 04:59:13 crc kubenswrapper[4894]: I0613 04:59:13.309384 4894 scope.go:117] "RemoveContainer" containerID="19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956" Jun 13 04:59:13 crc kubenswrapper[4894]: E0613 04:59:13.310603 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956\": container with ID starting with 19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956 not found: ID does not exist" containerID="19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956" Jun 13 04:59:13 crc kubenswrapper[4894]: I0613 04:59:13.310719 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956"} err="failed to get container status \"19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956\": rpc error: code = NotFound desc = could not find container \"19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956\": container with ID starting with 19542f23edcfb6d9f273b710ac015cd8f1b69d53db8a3acf443cf5a00e0b8956 not found: ID does not exist" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.287958 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bef62e-0876-488e-8009-2142cfcf0deb" path="/var/lib/kubelet/pods/98bef62e-0876-488e-8009-2142cfcf0deb/volumes" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.719854 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4"] Jun 13 04:59:14 crc kubenswrapper[4894]: E0613 04:59:14.720075 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bef62e-0876-488e-8009-2142cfcf0deb" containerName="container-00" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720089 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bef62e-0876-488e-8009-2142cfcf0deb" containerName="container-00" Jun 13 04:59:14 crc kubenswrapper[4894]: E0613 04:59:14.720107 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="util" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720115 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="util" Jun 13 04:59:14 crc kubenswrapper[4894]: E0613 04:59:14.720129 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="extract" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720138 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="extract" Jun 13 04:59:14 crc kubenswrapper[4894]: E0613 04:59:14.720151 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="pull" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720158 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="pull" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720273 4894 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b945725-0767-451f-9574-d95782ced9c9" containerName="extract" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720289 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bef62e-0876-488e-8009-2142cfcf0deb" containerName="container-00" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.720737 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.722568 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.722724 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-smw5w" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.735093 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4"] Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.736753 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jun 13 04:59:14 crc kubenswrapper[4894]: I0613 04:59:14.901053 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnssw\" (UniqueName: \"kubernetes.io/projected/e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0-kube-api-access-nnssw\") pod \"nmstate-operator-5d8f945fdc-jnpn4\" (UID: \"e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0\") " pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.002478 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnssw\" (UniqueName: \"kubernetes.io/projected/e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0-kube-api-access-nnssw\") pod \"nmstate-operator-5d8f945fdc-jnpn4\" (UID: \"e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0\") " pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.030806 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnssw\" (UniqueName: \"kubernetes.io/projected/e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0-kube-api-access-nnssw\") pod \"nmstate-operator-5d8f945fdc-jnpn4\" (UID: \"e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0\") " pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.036851 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.224778 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xlv6" Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.267387 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4"] Jun 13 04:59:15 crc kubenswrapper[4894]: W0613 04:59:15.276418 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ea5c7c_a747_4a1e_82ba_8b81e3c0a4c0.slice/crio-6ef7055c1c1c63a199a9dd7999285e969dbb843baa8f2f459081779a021079a0 WatchSource:0}: Error finding container 6ef7055c1c1c63a199a9dd7999285e969dbb843baa8f2f459081779a021079a0: Status 404 returned error can't find the container with id 6ef7055c1c1c63a199a9dd7999285e969dbb843baa8f2f459081779a021079a0 Jun 13 04:59:15 crc kubenswrapper[4894]: I0613 04:59:15.306153 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" event={"ID":"e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0","Type":"ContainerStarted","Data":"6ef7055c1c1c63a199a9dd7999285e969dbb843baa8f2f459081779a021079a0"} Jun 13 04:59:18 crc kubenswrapper[4894]: I0613 04:59:18.323256 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" event={"ID":"e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0","Type":"ContainerStarted","Data":"711e4f872ca0b399ede3439b50b0fbe6b8ca7899b0b978ee26e35ed2e6934504"} Jun 13 04:59:18 crc kubenswrapper[4894]: I0613 04:59:18.355112 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d8f945fdc-jnpn4" podStartSLOduration=1.854722129 podStartE2EDuration="4.355086394s" podCreationTimestamp="2025-06-13 04:59:14 +0000 UTC" firstStartedPulling="2025-06-13 04:59:15.27966819 +0000 UTC m=+513.725915653" lastFinishedPulling="2025-06-13 04:59:17.780032445 +0000 UTC m=+516.226279918" observedRunningTime="2025-06-13 04:59:18.348233363 +0000 UTC m=+516.794480866" watchObservedRunningTime="2025-06-13 04:59:18.355086394 +0000 UTC m=+516.801333897" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.803689 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-748555f888-pnvfx"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.805419 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.813068 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-79bb5" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.817591 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.818618 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.822118 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pmtdt"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.822682 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.823892 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.828619 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9b7t\" (UniqueName: \"kubernetes.io/projected/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-kube-api-access-l9b7t\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.828701 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-tls-key-pair\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.841173 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-748555f888-pnvfx"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.850323 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930154 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-nmstate-lock\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930206 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-dbus-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930362 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-ovs-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930457 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9b7t\" (UniqueName: \"kubernetes.io/projected/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-kube-api-access-l9b7t\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930527 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8xh\" (UniqueName: \"kubernetes.io/projected/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-kube-api-access-4d8xh\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930558 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-tls-key-pair\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.930586 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wvz\" (UniqueName: \"kubernetes.io/projected/b08ebefd-4622-4b62-8b92-63c658947cb1-kube-api-access-m2wvz\") pod \"nmstate-metrics-748555f888-pnvfx\" (UID: \"b08ebefd-4622-4b62-8b92-63c658947cb1\") " pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.936707 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-tls-key-pair\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.948276 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9b7t\" (UniqueName: \"kubernetes.io/projected/d541bc7b-8bed-4996-b3e3-e851b13f6fc4-kube-api-access-l9b7t\") pod \"nmstate-webhook-79c49d6bf4-jztch\" (UID: \"d541bc7b-8bed-4996-b3e3-e851b13f6fc4\") " pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.967513 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz"] Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.968156 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.970815 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rl84t" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.970977 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.971883 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jun 13 04:59:23 crc kubenswrapper[4894]: I0613 04:59:23.995074 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031585 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52d99494-f908-4ff0-95b0-48261d144df9-nginx-conf\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031727 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdkvr\" (UniqueName: \"kubernetes.io/projected/52d99494-f908-4ff0-95b0-48261d144df9-kube-api-access-bdkvr\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031767 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8xh\" (UniqueName: \"kubernetes.io/projected/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-kube-api-access-4d8xh\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031801 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wvz\" (UniqueName: \"kubernetes.io/projected/b08ebefd-4622-4b62-8b92-63c658947cb1-kube-api-access-m2wvz\") pod \"nmstate-metrics-748555f888-pnvfx\" (UID: \"b08ebefd-4622-4b62-8b92-63c658947cb1\") " pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031854 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-nmstate-lock\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031879 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-dbus-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031955 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-nmstate-lock\") pod \"nmstate-handler-pmtdt\" (UID: 
\"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.031896 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d99494-f908-4ff0-95b0-48261d144df9-plugin-serving-cert\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.032020 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-ovs-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.032096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-ovs-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.032358 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-dbus-socket\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.049551 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8xh\" (UniqueName: \"kubernetes.io/projected/fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8-kube-api-access-4d8xh\") pod \"nmstate-handler-pmtdt\" (UID: \"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8\") " pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.049950 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wvz\" (UniqueName: \"kubernetes.io/projected/b08ebefd-4622-4b62-8b92-63c658947cb1-kube-api-access-m2wvz\") pod \"nmstate-metrics-748555f888-pnvfx\" (UID: \"b08ebefd-4622-4b62-8b92-63c658947cb1\") " pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.129207 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.133025 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52d99494-f908-4ff0-95b0-48261d144df9-nginx-conf\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.133094 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdkvr\" (UniqueName: \"kubernetes.io/projected/52d99494-f908-4ff0-95b0-48261d144df9-kube-api-access-bdkvr\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.133152 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d99494-f908-4ff0-95b0-48261d144df9-plugin-serving-cert\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.133963 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52d99494-f908-4ff0-95b0-48261d144df9-nginx-conf\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.139234 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d99494-f908-4ff0-95b0-48261d144df9-plugin-serving-cert\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.139965 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.151928 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.162266 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdkvr\" (UniqueName: \"kubernetes.io/projected/52d99494-f908-4ff0-95b0-48261d144df9-kube-api-access-bdkvr\") pod \"nmstate-console-plugin-67b45cfc7d-k6jsz\" (UID: \"52d99494-f908-4ff0-95b0-48261d144df9\") " pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.232606 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c874c85b7-snjvh"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.242758 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.332821 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343193 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343460 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-oauth-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343480 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-oauth-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343500 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-trusted-ca-bundle\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343524 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-service-ca\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343543 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-console-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.343560 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zct6p\" (UniqueName: \"kubernetes.io/projected/5bf08182-585a-43e0-9e95-639053b1758c-kube-api-access-zct6p\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.423685 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c874c85b7-snjvh"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.434825 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pmtdt" event={"ID":"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8","Type":"ContainerStarted","Data":"9e6cdb42828f0fbcc0b3eea389a0ce740796cba6981d027d6dfde87c42edec93"} Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446115 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-service-ca\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446154 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-console-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446175 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zct6p\" (UniqueName: \"kubernetes.io/projected/5bf08182-585a-43e0-9e95-639053b1758c-kube-api-access-zct6p\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446279 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446330 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-oauth-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446356 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-oauth-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.446389 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-trusted-ca-bundle\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.447295 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-service-ca\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.447321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-trusted-ca-bundle\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.448905 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-console-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.452455 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.455992 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bf08182-585a-43e0-9e95-639053b1758c-console-oauth-config\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.458549 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bf08182-585a-43e0-9e95-639053b1758c-oauth-serving-cert\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.470594 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zct6p\" (UniqueName: \"kubernetes.io/projected/5bf08182-585a-43e0-9e95-639053b1758c-kube-api-access-zct6p\") pod \"console-7c874c85b7-snjvh\" (UID: \"5bf08182-585a-43e0-9e95-639053b1758c\") " pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.522931 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-748555f888-pnvfx"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.610165 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.638344 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.661030 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz"] Jun 13 04:59:24 crc kubenswrapper[4894]: I0613 04:59:24.816289 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c874c85b7-snjvh"] Jun 13 04:59:24 crc kubenswrapper[4894]: W0613 04:59:24.824936 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf08182_585a_43e0_9e95_639053b1758c.slice/crio-1f3aac28929f09bbc6a7b7fb3066e85590b40b963c042fc97e47c303711d0ec8 WatchSource:0}: Error finding container 1f3aac28929f09bbc6a7b7fb3066e85590b40b963c042fc97e47c303711d0ec8: Status 404 returned error can't find the container with id 1f3aac28929f09bbc6a7b7fb3066e85590b40b963c042fc97e47c303711d0ec8 Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.444446 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" event={"ID":"b08ebefd-4622-4b62-8b92-63c658947cb1","Type":"ContainerStarted","Data":"a20d4a94f650862eef5f0f70db0d27802f6061cfe4b9a964e8231f22bafeb3d6"} Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.445966 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" event={"ID":"d541bc7b-8bed-4996-b3e3-e851b13f6fc4","Type":"ContainerStarted","Data":"ec1e239f8735f6c14f5b5b77e8b2343ad5b804412f97f3b29ec77f0170b39d54"} Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.447271 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" event={"ID":"52d99494-f908-4ff0-95b0-48261d144df9","Type":"ContainerStarted","Data":"23557e3b69e766066f4526fa8820c681656c82fc666a806d5f5434d40575b581"} Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.449429 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c874c85b7-snjvh" event={"ID":"5bf08182-585a-43e0-9e95-639053b1758c","Type":"ContainerStarted","Data":"514c0c6b1b39c732f3630a05e1ce10a1c3ce334fd7428182ee7157a269182c6d"} Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.449509 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c874c85b7-snjvh" event={"ID":"5bf08182-585a-43e0-9e95-639053b1758c","Type":"ContainerStarted","Data":"1f3aac28929f09bbc6a7b7fb3066e85590b40b963c042fc97e47c303711d0ec8"} Jun 13 04:59:25 crc kubenswrapper[4894]: I0613 04:59:25.475687 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c874c85b7-snjvh" podStartSLOduration=1.4756739730000001 podStartE2EDuration="1.475673973s" podCreationTimestamp="2025-06-13 04:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 04:59:25.472896035 +0000 UTC m=+523.919143498" watchObservedRunningTime="2025-06-13 04:59:25.475673973 +0000 UTC m=+523.921921436" Jun 13 04:59:26 crc kubenswrapper[4894]: I0613 04:59:26.236115 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:59:26 crc 
kubenswrapper[4894]: I0613 04:59:26.236333 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.473499 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" event={"ID":"d541bc7b-8bed-4996-b3e3-e851b13f6fc4","Type":"ContainerStarted","Data":"340356c8e71a25bbdc67fc5244c3c431ba030005af589496113ba1861b3d7951"} Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.474226 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.480142 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" event={"ID":"52d99494-f908-4ff0-95b0-48261d144df9","Type":"ContainerStarted","Data":"5c2beb43ca2e712ac8cad5feb474a8c312a5aa05aa8d24ec6abef4cbaaeb7ee3"} Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.481767 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" event={"ID":"b08ebefd-4622-4b62-8b92-63c658947cb1","Type":"ContainerStarted","Data":"ab25ec4171b3006761a6a796d28bd3d1caf9e3e714f677f0b1b1bad05d10fb6a"} Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.483290 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pmtdt" event={"ID":"fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8","Type":"ContainerStarted","Data":"476b19e35e27897d0dea07898ee004b3db92caa86d6ffa58d7f84aa953e59973"} Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.483495 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.495638 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" podStartSLOduration=2.42888157 podStartE2EDuration="5.495611913s" podCreationTimestamp="2025-06-13 04:59:23 +0000 UTC" firstStartedPulling="2025-06-13 04:59:24.618019754 +0000 UTC m=+523.064267217" lastFinishedPulling="2025-06-13 04:59:27.684750057 +0000 UTC m=+526.130997560" observedRunningTime="2025-06-13 04:59:28.489711988 +0000 UTC m=+526.935959491" watchObservedRunningTime="2025-06-13 04:59:28.495611913 +0000 UTC m=+526.941859416" Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.518491 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-67b45cfc7d-k6jsz" podStartSLOduration=2.503908269 podStartE2EDuration="5.518466989s" podCreationTimestamp="2025-06-13 04:59:23 +0000 UTC" firstStartedPulling="2025-06-13 04:59:24.667941914 +0000 UTC m=+523.114189377" lastFinishedPulling="2025-06-13 04:59:27.682500594 +0000 UTC m=+526.128748097" observedRunningTime="2025-06-13 04:59:28.513282415 +0000 UTC m=+526.959529888" watchObservedRunningTime="2025-06-13 04:59:28.518466989 +0000 UTC m=+526.964714462" Jun 13 04:59:28 crc kubenswrapper[4894]: I0613 04:59:28.543583 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pmtdt" podStartSLOduration=2.073352832 
podStartE2EDuration="5.543562978s" podCreationTimestamp="2025-06-13 04:59:23 +0000 UTC" firstStartedPulling="2025-06-13 04:59:24.216612418 +0000 UTC m=+522.662859881" lastFinishedPulling="2025-06-13 04:59:27.686822534 +0000 UTC m=+526.133070027" observedRunningTime="2025-06-13 04:59:28.533882928 +0000 UTC m=+526.980130401" watchObservedRunningTime="2025-06-13 04:59:28.543562978 +0000 UTC m=+526.989810451" Jun 13 04:59:30 crc kubenswrapper[4894]: I0613 04:59:30.497854 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" event={"ID":"b08ebefd-4622-4b62-8b92-63c658947cb1","Type":"ContainerStarted","Data":"8efb69dc0fe91eb111fba92915d40e8cc991739de7fdb822c8f90e4ff4b47bff"} Jun 13 04:59:30 crc kubenswrapper[4894]: I0613 04:59:30.518969 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-748555f888-pnvfx" podStartSLOduration=2.049856987 podStartE2EDuration="7.518940285s" podCreationTimestamp="2025-06-13 04:59:23 +0000 UTC" firstStartedPulling="2025-06-13 04:59:24.537997836 +0000 UTC m=+522.984245289" lastFinishedPulling="2025-06-13 04:59:30.007081124 +0000 UTC m=+528.453328587" observedRunningTime="2025-06-13 04:59:30.517892056 +0000 UTC m=+528.964139559" watchObservedRunningTime="2025-06-13 04:59:30.518940285 +0000 UTC m=+528.965187778" Jun 13 04:59:34 crc kubenswrapper[4894]: I0613 04:59:34.188121 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pmtdt" Jun 13 04:59:34 crc kubenswrapper[4894]: I0613 04:59:34.639710 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:34 crc kubenswrapper[4894]: I0613 04:59:34.639879 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:34 crc kubenswrapper[4894]: I0613 04:59:34.649315 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:35 crc kubenswrapper[4894]: I0613 04:59:35.547782 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c874c85b7-snjvh" Jun 13 04:59:35 crc kubenswrapper[4894]: I0613 04:59:35.632996 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 04:59:44 crc kubenswrapper[4894]: I0613 04:59:44.149008 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-79c49d6bf4-jztch" Jun 13 04:59:56 crc kubenswrapper[4894]: I0613 04:59:56.236783 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 04:59:56 crc kubenswrapper[4894]: I0613 04:59:56.237393 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.141773 4894 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp"] Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.147555 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.151694 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp"] Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.156270 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.158425 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.251760 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.252033 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbrl\" (UniqueName: \"kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.252103 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.352817 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.353057 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbrl\" (UniqueName: \"kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.353217 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.354180 
4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.367309 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.370500 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbrl\" (UniqueName: \"kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl\") pod \"collect-profiles-29163180-6wgjp\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.477816 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.691598 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pllr7" podUID="08284aa4-ae65-47a7-940e-9f558505402a" containerName="console" containerID="cri-o://8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4" gracePeriod=15 Jun 13 05:00:00 crc kubenswrapper[4894]: I0613 05:00:00.898481 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp"] Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.092730 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pllr7_08284aa4-ae65-47a7-940e-9f558505402a/console/0.log" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.093041 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265338 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265475 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265539 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8srp\" (UniqueName: \"kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265570 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265605 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265644 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.265713 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config\") pod \"08284aa4-ae65-47a7-940e-9f558505402a\" (UID: \"08284aa4-ae65-47a7-940e-9f558505402a\") " Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.266203 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.266547 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca" (OuterVolumeSpecName: "service-ca") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.266926 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config" (OuterVolumeSpecName: "console-config") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.267335 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.271712 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.271823 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp" (OuterVolumeSpecName: "kube-api-access-r8srp") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "kube-api-access-r8srp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.272260 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "08284aa4-ae65-47a7-940e-9f558505402a" (UID: "08284aa4-ae65-47a7-940e-9f558505402a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.367705 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8srp\" (UniqueName: \"kubernetes.io/projected/08284aa4-ae65-47a7-940e-9f558505402a-kube-api-access-r8srp\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368204 4894 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368368 4894 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368505 4894 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08284aa4-ae65-47a7-940e-9f558505402a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368627 4894 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-console-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368798 4894 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.368936 4894 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08284aa4-ae65-47a7-940e-9f558505402a-service-ca\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768311 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pllr7_08284aa4-ae65-47a7-940e-9f558505402a/console/0.log" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768411 4894 generic.go:334] "Generic (PLEG): container finished" podID="08284aa4-ae65-47a7-940e-9f558505402a" containerID="8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4" exitCode=2 Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768518 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pllr7" event={"ID":"08284aa4-ae65-47a7-940e-9f558505402a","Type":"ContainerDied","Data":"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4"} Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768549 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pllr7" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768565 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pllr7" event={"ID":"08284aa4-ae65-47a7-940e-9f558505402a","Type":"ContainerDied","Data":"b06d98fc4a505197d17828bec2bccedf2e531edb3c46bcf395b3f25f419fafe7"} Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.768583 4894 scope.go:117] "RemoveContainer" containerID="8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.771831 4894 generic.go:334] "Generic (PLEG): container finished" podID="279dc827-6e3f-44e7-b36a-77e117eb9f07" containerID="85044fda6b7e1b33816bafd58cc20e23f66e54a0e9d71d73a5baff597d29a57d" exitCode=0 Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.771872 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" event={"ID":"279dc827-6e3f-44e7-b36a-77e117eb9f07","Type":"ContainerDied","Data":"85044fda6b7e1b33816bafd58cc20e23f66e54a0e9d71d73a5baff597d29a57d"} Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.771901 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" event={"ID":"279dc827-6e3f-44e7-b36a-77e117eb9f07","Type":"ContainerStarted","Data":"51ceca28bfefea231993af717e6d1c5bacd4ba77d4e9b7999c24e9ce78f11057"} Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.802490 4894 scope.go:117] "RemoveContainer" containerID="8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4" Jun 13 05:00:01 crc kubenswrapper[4894]: E0613 05:00:01.803173 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4\": container with ID starting with 8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4 not found: ID does not exist" containerID="8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.804231 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4"} err="failed to get container status \"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4\": rpc error: code = NotFound desc = could not find container \"8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4\": container with ID starting with 8fec7cfe98577fbe7c54526f11ff30f6adad5fd26bcd5a5010817c40e8c404e4 not found: ID does not exist" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.822890 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.829731 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pllr7"] Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.889783 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-6jxjq"] Jun 13 05:00:01 crc kubenswrapper[4894]: E0613 05:00:01.890116 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08284aa4-ae65-47a7-940e-9f558505402a" containerName="console" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.890144 4894 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08284aa4-ae65-47a7-940e-9f558505402a" containerName="console" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.890353 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="08284aa4-ae65-47a7-940e-9f558505402a" containerName="console" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.891032 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-6jxjq" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.893938 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.896684 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:00:01 crc kubenswrapper[4894]: I0613 05:00:01.897423 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.079180 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.079807 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2tlj\" (UniqueName: \"kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.181460 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2tlj\" (UniqueName: \"kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.181731 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.181889 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.212096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2tlj\" (UniqueName: \"kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj\") pod \"crc-debug-6jxjq\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.288862 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08284aa4-ae65-47a7-940e-9f558505402a" path="/var/lib/kubelet/pods/08284aa4-ae65-47a7-940e-9f558505402a/volumes" Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.507999 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-6jxjq" Jun 13 05:00:02 crc kubenswrapper[4894]: W0613 05:00:02.533836 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34a889f_ebd0_4385_8498_49974aa64642.slice/crio-85722760e1f33b3870e90f51794c314ed59a02031de74f9a4747dc5c661d68b7 WatchSource:0}: Error finding container 85722760e1f33b3870e90f51794c314ed59a02031de74f9a4747dc5c661d68b7: Status 404 returned error can't find the container with id 85722760e1f33b3870e90f51794c314ed59a02031de74f9a4747dc5c661d68b7 Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.779648 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-6jxjq" event={"ID":"d34a889f-ebd0-4385-8498-49974aa64642","Type":"ContainerStarted","Data":"9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35"} Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.779729 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-6jxjq" event={"ID":"d34a889f-ebd0-4385-8498-49974aa64642","Type":"ContainerStarted","Data":"85722760e1f33b3870e90f51794c314ed59a02031de74f9a4747dc5c661d68b7"} Jun 13 05:00:02 crc kubenswrapper[4894]: I0613 05:00:02.806477 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-6jxjq" podStartSLOduration=1.806400791 podStartE2EDuration="1.806400791s" podCreationTimestamp="2025-06-13 05:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:00:02.798164492 +0000 UTC m=+561.244411975" watchObservedRunningTime="2025-06-13 05:00:02.806400791 +0000 UTC m=+561.252648334" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.041236 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.194330 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume\") pod \"279dc827-6e3f-44e7-b36a-77e117eb9f07\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.194424 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume\") pod \"279dc827-6e3f-44e7-b36a-77e117eb9f07\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.194581 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxbrl\" (UniqueName: \"kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl\") pod \"279dc827-6e3f-44e7-b36a-77e117eb9f07\" (UID: \"279dc827-6e3f-44e7-b36a-77e117eb9f07\") " Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.195568 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume" (OuterVolumeSpecName: "config-volume") pod "279dc827-6e3f-44e7-b36a-77e117eb9f07" (UID: "279dc827-6e3f-44e7-b36a-77e117eb9f07"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.200122 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "279dc827-6e3f-44e7-b36a-77e117eb9f07" (UID: "279dc827-6e3f-44e7-b36a-77e117eb9f07"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.201550 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl" (OuterVolumeSpecName: "kube-api-access-xxbrl") pod "279dc827-6e3f-44e7-b36a-77e117eb9f07" (UID: "279dc827-6e3f-44e7-b36a-77e117eb9f07"). InnerVolumeSpecName "kube-api-access-xxbrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.297213 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/279dc827-6e3f-44e7-b36a-77e117eb9f07-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.297304 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxbrl\" (UniqueName: \"kubernetes.io/projected/279dc827-6e3f-44e7-b36a-77e117eb9f07-kube-api-access-xxbrl\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.297346 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/279dc827-6e3f-44e7-b36a-77e117eb9f07-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.794882 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" event={"ID":"279dc827-6e3f-44e7-b36a-77e117eb9f07","Type":"ContainerDied","Data":"51ceca28bfefea231993af717e6d1c5bacd4ba77d4e9b7999c24e9ce78f11057"} Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.794941 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51ceca28bfefea231993af717e6d1c5bacd4ba77d4e9b7999c24e9ce78f11057" Jun 13 05:00:03 crc kubenswrapper[4894]: I0613 05:00:03.794970 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp" Jun 13 05:00:12 crc kubenswrapper[4894]: I0613 05:00:12.950426 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-6jxjq"] Jun 13 05:00:12 crc kubenswrapper[4894]: I0613 05:00:12.951320 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-6jxjq" podUID="d34a889f-ebd0-4385-8498-49974aa64642" containerName="container-00" containerID="cri-o://9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35" gracePeriod=2 Jun 13 05:00:12 crc kubenswrapper[4894]: I0613 05:00:12.957338 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-6jxjq"] Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.004894 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-6jxjq" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.201382 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host\") pod \"d34a889f-ebd0-4385-8498-49974aa64642\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.201478 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2tlj\" (UniqueName: \"kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj\") pod \"d34a889f-ebd0-4385-8498-49974aa64642\" (UID: \"d34a889f-ebd0-4385-8498-49974aa64642\") " Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.201595 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host" (OuterVolumeSpecName: "host") pod "d34a889f-ebd0-4385-8498-49974aa64642" (UID: "d34a889f-ebd0-4385-8498-49974aa64642"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.201913 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d34a889f-ebd0-4385-8498-49974aa64642-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.220288 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj" (OuterVolumeSpecName: "kube-api-access-d2tlj") pod "d34a889f-ebd0-4385-8498-49974aa64642" (UID: "d34a889f-ebd0-4385-8498-49974aa64642"). InnerVolumeSpecName "kube-api-access-d2tlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.303609 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2tlj\" (UniqueName: \"kubernetes.io/projected/d34a889f-ebd0-4385-8498-49974aa64642-kube-api-access-d2tlj\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.857784 4894 generic.go:334] "Generic (PLEG): container finished" podID="d34a889f-ebd0-4385-8498-49974aa64642" containerID="9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35" exitCode=0 Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.857860 4894 scope.go:117] "RemoveContainer" containerID="9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.858141 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-6jxjq" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.881176 4894 scope.go:117] "RemoveContainer" containerID="9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35" Jun 13 05:00:13 crc kubenswrapper[4894]: E0613 05:00:13.881673 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35\": container with ID starting with 9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35 not found: ID does not exist" containerID="9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35" Jun 13 05:00:13 crc kubenswrapper[4894]: I0613 05:00:13.881702 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35"} err="failed to get container status \"9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35\": rpc error: code = NotFound desc = could not find container \"9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35\": container with ID starting with 9cb90340749c2d3d90643af20f26d318c1fc03171f2f042f46c3816bc54dca35 not found: ID does not exist" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.274542 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz"] Jun 13 05:00:14 crc kubenswrapper[4894]: E0613 05:00:14.274993 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34a889f-ebd0-4385-8498-49974aa64642" containerName="container-00" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.275005 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34a889f-ebd0-4385-8498-49974aa64642" containerName="container-00" Jun 13 05:00:14 crc kubenswrapper[4894]: E0613 05:00:14.275017 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279dc827-6e3f-44e7-b36a-77e117eb9f07" containerName="collect-profiles" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.275022 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="279dc827-6e3f-44e7-b36a-77e117eb9f07" containerName="collect-profiles" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.275116 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34a889f-ebd0-4385-8498-49974aa64642" containerName="container-00" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.275128 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="279dc827-6e3f-44e7-b36a-77e117eb9f07" containerName="collect-profiles" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.275838 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.279256 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.284705 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34a889f-ebd0-4385-8498-49974aa64642" path="/var/lib/kubelet/pods/d34a889f-ebd0-4385-8498-49974aa64642/volumes" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.285098 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz"] Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.419017 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.419072 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.419103 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchth\" (UniqueName: \"kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.519885 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.519941 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.519991 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchth\" (UniqueName: \"kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " 
pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.520431 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.520771 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.545042 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchth\" (UniqueName: \"kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth\") pod \"6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.606008 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.844284 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz"] Jun 13 05:00:14 crc kubenswrapper[4894]: W0613 05:00:14.851693 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50205d6d_0191_47eb_bf4a_1e9157be8634.slice/crio-99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4 WatchSource:0}: Error finding container 99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4: Status 404 returned error can't find the container with id 99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4 Jun 13 05:00:14 crc kubenswrapper[4894]: I0613 05:00:14.894081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" event={"ID":"50205d6d-0191-47eb-bf4a-1e9157be8634","Type":"ContainerStarted","Data":"99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4"} Jun 13 05:00:15 crc kubenswrapper[4894]: I0613 05:00:15.904016 4894 generic.go:334] "Generic (PLEG): container finished" podID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerID="6897a002101faaf00b1fe4af09c5feb731d532e28a0fab5ad83538f2534e6ff5" exitCode=0 Jun 13 05:00:15 crc kubenswrapper[4894]: I0613 05:00:15.904135 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" event={"ID":"50205d6d-0191-47eb-bf4a-1e9157be8634","Type":"ContainerDied","Data":"6897a002101faaf00b1fe4af09c5feb731d532e28a0fab5ad83538f2534e6ff5"} Jun 13 05:00:17 crc kubenswrapper[4894]: I0613 05:00:17.920087 4894 generic.go:334] "Generic (PLEG): container finished" 
podID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerID="41e1dbb27fef8c6cfcb9a645dd524d239eb13c749a7f9e22088b88b06d79fdc5" exitCode=0 Jun 13 05:00:17 crc kubenswrapper[4894]: I0613 05:00:17.920214 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" event={"ID":"50205d6d-0191-47eb-bf4a-1e9157be8634","Type":"ContainerDied","Data":"41e1dbb27fef8c6cfcb9a645dd524d239eb13c749a7f9e22088b88b06d79fdc5"} Jun 13 05:00:18 crc kubenswrapper[4894]: I0613 05:00:18.932880 4894 generic.go:334] "Generic (PLEG): container finished" podID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerID="4c8d820525c88f9936dda5a7521b6b3845aa26a9ec2015cbf61b60f721e0c0d3" exitCode=0 Jun 13 05:00:18 crc kubenswrapper[4894]: I0613 05:00:18.932941 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" event={"ID":"50205d6d-0191-47eb-bf4a-1e9157be8634","Type":"ContainerDied","Data":"4c8d820525c88f9936dda5a7521b6b3845aa26a9ec2015cbf61b60f721e0c0d3"} Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.286291 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.403628 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dchth\" (UniqueName: \"kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth\") pod \"50205d6d-0191-47eb-bf4a-1e9157be8634\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.403980 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util\") pod \"50205d6d-0191-47eb-bf4a-1e9157be8634\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.404020 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle\") pod \"50205d6d-0191-47eb-bf4a-1e9157be8634\" (UID: \"50205d6d-0191-47eb-bf4a-1e9157be8634\") " Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.405069 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle" (OuterVolumeSpecName: "bundle") pod "50205d6d-0191-47eb-bf4a-1e9157be8634" (UID: "50205d6d-0191-47eb-bf4a-1e9157be8634"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.418818 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth" (OuterVolumeSpecName: "kube-api-access-dchth") pod "50205d6d-0191-47eb-bf4a-1e9157be8634" (UID: "50205d6d-0191-47eb-bf4a-1e9157be8634"). InnerVolumeSpecName "kube-api-access-dchth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.420600 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util" (OuterVolumeSpecName: "util") pod "50205d6d-0191-47eb-bf4a-1e9157be8634" (UID: "50205d6d-0191-47eb-bf4a-1e9157be8634"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.505298 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dchth\" (UniqueName: \"kubernetes.io/projected/50205d6d-0191-47eb-bf4a-1e9157be8634-kube-api-access-dchth\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.505346 4894 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-util\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.505365 4894 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50205d6d-0191-47eb-bf4a-1e9157be8634-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.959200 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" event={"ID":"50205d6d-0191-47eb-bf4a-1e9157be8634","Type":"ContainerDied","Data":"99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4"} Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.959312 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99cfb0e93c83f9f9f95588d87fcc65eec0680c7dcaecf9f601acc648737c51a4" Jun 13 05:00:20 crc kubenswrapper[4894]: I0613 05:00:20.959330 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz" Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.236500 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.237178 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.237265 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.238162 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.238272 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa" gracePeriod=600 Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.997109 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa" exitCode=0 Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.997166 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa"} Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.998346 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe"} Jun 13 05:00:26 crc kubenswrapper[4894]: I0613 05:00:26.998386 4894 scope.go:117] "RemoveContainer" containerID="6aee468e30746420937d0220de95d3b6360a28a0959b084261acbc7fac54e3e9" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.341859 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8"] Jun 13 05:00:29 crc kubenswrapper[4894]: E0613 05:00:29.342581 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="pull" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.342596 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" 
containerName="pull" Jun 13 05:00:29 crc kubenswrapper[4894]: E0613 05:00:29.342614 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="util" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.342622 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="util" Jun 13 05:00:29 crc kubenswrapper[4894]: E0613 05:00:29.342637 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="extract" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.342644 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="extract" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.342889 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="50205d6d-0191-47eb-bf4a-1e9157be8634" containerName="extract" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.343506 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.356271 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.356533 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lccc6" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.356712 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.356905 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.357039 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.379982 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8"] Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.424914 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-webhook-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.425250 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdt7b\" (UniqueName: \"kubernetes.io/projected/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-kube-api-access-qdt7b\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.425376 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-apiservice-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" 
(UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.526338 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-webhook-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.526675 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdt7b\" (UniqueName: \"kubernetes.io/projected/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-kube-api-access-qdt7b\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.526768 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-apiservice-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.540489 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-webhook-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.545109 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-apiservice-cert\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.574550 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdt7b\" (UniqueName: \"kubernetes.io/projected/6b796cfd-5cf3-47be-ae8e-f3d77fc7917d-kube-api-access-qdt7b\") pod \"metallb-operator-controller-manager-856f595c5f-qqwj8\" (UID: \"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d\") " pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.669298 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.810117 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq"] Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.811023 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.826857 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.827144 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zllgq" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.828494 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.840271 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq"] Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.931822 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-webhook-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.931896 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxbm\" (UniqueName: \"kubernetes.io/projected/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-kube-api-access-nbxbm\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:29 crc kubenswrapper[4894]: I0613 05:00:29.931931 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-apiservice-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.033471 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-webhook-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.033532 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxbm\" (UniqueName: \"kubernetes.io/projected/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-kube-api-access-nbxbm\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.033563 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-apiservice-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.042314 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-apiservice-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.043676 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-webhook-cert\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.052545 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxbm\" (UniqueName: \"kubernetes.io/projected/d81de5ae-61d6-4f4c-b9b2-03dd880f3465-kube-api-access-nbxbm\") pod \"metallb-operator-webhook-server-d4cd7966b-4t7qq\" (UID: \"d81de5ae-61d6-4f4c-b9b2-03dd880f3465\") " pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.097948 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8"] Jun 13 05:00:30 crc kubenswrapper[4894]: W0613 05:00:30.102176 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b796cfd_5cf3_47be_ae8e_f3d77fc7917d.slice/crio-6523c10cf3aa7d1ebf5083aa1e8a956443e79af3fee46ce4c91575586c7192ca WatchSource:0}: Error finding container 6523c10cf3aa7d1ebf5083aa1e8a956443e79af3fee46ce4c91575586c7192ca: Status 404 returned error can't find the container with id 6523c10cf3aa7d1ebf5083aa1e8a956443e79af3fee46ce4c91575586c7192ca Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.127992 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:30 crc kubenswrapper[4894]: I0613 05:00:30.394481 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq"] Jun 13 05:00:31 crc kubenswrapper[4894]: I0613 05:00:31.026376 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" event={"ID":"d81de5ae-61d6-4f4c-b9b2-03dd880f3465","Type":"ContainerStarted","Data":"ec38dee6a0cfeda130bf0767cc81111bbdb66359c09d0c8142a2ab51017be746"} Jun 13 05:00:31 crc kubenswrapper[4894]: I0613 05:00:31.027876 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" event={"ID":"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d","Type":"ContainerStarted","Data":"6523c10cf3aa7d1ebf5083aa1e8a956443e79af3fee46ce4c91575586c7192ca"} Jun 13 05:00:35 crc kubenswrapper[4894]: I0613 05:00:35.057133 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" event={"ID":"6b796cfd-5cf3-47be-ae8e-f3d77fc7917d","Type":"ContainerStarted","Data":"736181bf16dc483645b2403ba10fe9e42ff629b2b4868e29377380737dc290ce"} Jun 13 05:00:35 crc kubenswrapper[4894]: I0613 05:00:35.057624 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:00:35 crc kubenswrapper[4894]: I0613 05:00:35.087772 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" podStartSLOduration=1.952735685 podStartE2EDuration="6.087748667s" podCreationTimestamp="2025-06-13 05:00:29 +0000 UTC" firstStartedPulling="2025-06-13 05:00:30.105834282 +0000 UTC m=+588.552081745" lastFinishedPulling="2025-06-13 05:00:34.240847264 +0000 UTC m=+592.687094727" observedRunningTime="2025-06-13 05:00:35.08502454 +0000 UTC m=+593.531272003" watchObservedRunningTime="2025-06-13 05:00:35.087748667 +0000 UTC m=+593.533996130" Jun 13 05:00:37 crc kubenswrapper[4894]: I0613 05:00:37.076093 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" event={"ID":"d81de5ae-61d6-4f4c-b9b2-03dd880f3465","Type":"ContainerStarted","Data":"ffd5258addf39f3b58afe4579befb192624140c6923db9a43e1415f19ccfaee7"} Jun 13 05:00:37 crc kubenswrapper[4894]: I0613 05:00:37.076787 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" Jun 13 05:00:37 crc kubenswrapper[4894]: I0613 05:00:37.107107 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" podStartSLOduration=2.430813878 podStartE2EDuration="8.10708973s" podCreationTimestamp="2025-06-13 05:00:29 +0000 UTC" firstStartedPulling="2025-06-13 05:00:30.419244193 +0000 UTC m=+588.865491656" lastFinishedPulling="2025-06-13 05:00:36.095520025 +0000 UTC m=+594.541767508" observedRunningTime="2025-06-13 05:00:37.10638778 +0000 UTC m=+595.552635243" watchObservedRunningTime="2025-06-13 05:00:37.10708973 +0000 UTC m=+595.553337193" Jun 13 05:00:50 crc kubenswrapper[4894]: I0613 05:00:50.137882 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d4cd7966b-4t7qq" 
Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.474018 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-b2fgf"] Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.476016 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.479151 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.479469 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.489081 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.502940 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg2n4\" (UniqueName: \"kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.503125 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.604438 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.604539 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg2n4\" (UniqueName: \"kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.604758 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.640640 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg2n4\" (UniqueName: \"kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4\") pod \"crc-debug-b2fgf\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " pod="openstack/crc-debug-b2fgf" Jun 13 05:01:02 crc kubenswrapper[4894]: I0613 05:01:02.812181 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-b2fgf" Jun 13 05:01:03 crc kubenswrapper[4894]: I0613 05:01:03.268880 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b2fgf" event={"ID":"0cb242c4-962c-4e89-8ccf-18e99818513e","Type":"ContainerStarted","Data":"65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246"} Jun 13 05:01:03 crc kubenswrapper[4894]: I0613 05:01:03.269154 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b2fgf" event={"ID":"0cb242c4-962c-4e89-8ccf-18e99818513e","Type":"ContainerStarted","Data":"b31f09370378c9b624f34ac2cc98b0c44751b8bc1a71af3ba39d7da9138dd689"} Jun 13 05:01:09 crc kubenswrapper[4894]: I0613 05:01:09.672758 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-856f595c5f-qqwj8" Jun 13 05:01:09 crc kubenswrapper[4894]: I0613 05:01:09.731179 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-b2fgf" podStartSLOduration=7.731145161 podStartE2EDuration="7.731145161s" podCreationTimestamp="2025-06-13 05:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:01:03.284262902 +0000 UTC m=+621.730510365" watchObservedRunningTime="2025-06-13 05:01:09.731145161 +0000 UTC m=+628.177392714" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.590979 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kwsg5"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.593005 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.598619 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.600927 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.602428 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.603136 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.606063 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vn4sk" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.606311 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.622085 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.684323 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5z464"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.685352 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.687766 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.687905 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.688322 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6zm9w" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.691232 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.699888 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5f968f88cc-wrmdl"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.700753 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.702891 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.712904 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713038 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt4r2\" (UniqueName: \"kubernetes.io/projected/ce477a54-7315-4d18-9fc3-af3bd9216888-kube-api-access-dt4r2\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713117 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-sockets\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713193 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics-certs\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713265 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-conf\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713339 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-startup\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 
crc kubenswrapper[4894]: I0613 05:01:10.713414 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-reloader\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713493 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tth\" (UniqueName: \"kubernetes.io/projected/3b112a53-8bb0-4587-9f69-debaf87494c9-kube-api-access-79tth\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: \"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.713562 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b112a53-8bb0-4587-9f69-debaf87494c9-cert\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: \"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.721540 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5f968f88cc-wrmdl"] Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.814577 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-cert\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.814831 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tth\" (UniqueName: \"kubernetes.io/projected/3b112a53-8bb0-4587-9f69-debaf87494c9-kube-api-access-79tth\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: \"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.814939 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b112a53-8bb0-4587-9f69-debaf87494c9-cert\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: \"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815034 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsvr\" (UniqueName: \"kubernetes.io/projected/8d6bd32d-109c-4f18-a8de-057098dae117-kube-api-access-7xsvr\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815125 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815207 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt4r2\" (UniqueName: 
\"kubernetes.io/projected/ce477a54-7315-4d18-9fc3-af3bd9216888-kube-api-access-dt4r2\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815294 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-sockets\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815375 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd2c4\" (UniqueName: \"kubernetes.io/projected/7138d1ac-8b33-4219-b8fd-303eae7e0334-kube-api-access-rd2c4\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815458 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815531 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-metrics-certs\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815611 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8d6bd32d-109c-4f18-a8de-057098dae117-metallb-excludel2\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815681 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-sockets\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815541 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815703 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics-certs\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815870 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-conf\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 
05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.815952 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-metrics-certs\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.816056 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-startup\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.816158 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-reloader\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.816238 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-conf\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.816391 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ce477a54-7315-4d18-9fc3-af3bd9216888-reloader\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.816942 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ce477a54-7315-4d18-9fc3-af3bd9216888-frr-startup\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.822005 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b112a53-8bb0-4587-9f69-debaf87494c9-cert\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: \"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.828261 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce477a54-7315-4d18-9fc3-af3bd9216888-metrics-certs\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.845167 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt4r2\" (UniqueName: \"kubernetes.io/projected/ce477a54-7315-4d18-9fc3-af3bd9216888-kube-api-access-dt4r2\") pod \"frr-k8s-kwsg5\" (UID: \"ce477a54-7315-4d18-9fc3-af3bd9216888\") " pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.845212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tth\" (UniqueName: \"kubernetes.io/projected/3b112a53-8bb0-4587-9f69-debaf87494c9-kube-api-access-79tth\") pod \"frr-k8s-webhook-server-8457d999f9-dn8gv\" (UID: 
\"3b112a53-8bb0-4587-9f69-debaf87494c9\") " pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.905915 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.919727 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924348 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-cert\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924406 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsvr\" (UniqueName: \"kubernetes.io/projected/8d6bd32d-109c-4f18-a8de-057098dae117-kube-api-access-7xsvr\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924436 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd2c4\" (UniqueName: \"kubernetes.io/projected/7138d1ac-8b33-4219-b8fd-303eae7e0334-kube-api-access-rd2c4\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924464 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924482 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-metrics-certs\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924498 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8d6bd32d-109c-4f18-a8de-057098dae117-metallb-excludel2\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.924523 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-metrics-certs\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.927089 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-metrics-certs\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.927936 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-metrics-certs\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.927936 4894 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jun 13 05:01:10 crc kubenswrapper[4894]: E0613 05:01:10.928122 4894 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jun 13 05:01:10 crc kubenswrapper[4894]: E0613 05:01:10.928162 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist podName:8d6bd32d-109c-4f18-a8de-057098dae117 nodeName:}" failed. No retries permitted until 2025-06-13 05:01:11.428148841 +0000 UTC m=+629.874396304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist") pod "speaker-5z464" (UID: "8d6bd32d-109c-4f18-a8de-057098dae117") : secret "metallb-memberlist" not found Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.929314 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8d6bd32d-109c-4f18-a8de-057098dae117-metallb-excludel2\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.942491 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd2c4\" (UniqueName: \"kubernetes.io/projected/7138d1ac-8b33-4219-b8fd-303eae7e0334-kube-api-access-rd2c4\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.945092 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7138d1ac-8b33-4219-b8fd-303eae7e0334-cert\") pod \"controller-5f968f88cc-wrmdl\" (UID: \"7138d1ac-8b33-4219-b8fd-303eae7e0334\") " pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:10 crc kubenswrapper[4894]: I0613 05:01:10.953094 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsvr\" (UniqueName: \"kubernetes.io/projected/8d6bd32d-109c-4f18-a8de-057098dae117-kube-api-access-7xsvr\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.011905 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.251509 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5f968f88cc-wrmdl"] Jun 13 05:01:11 crc kubenswrapper[4894]: W0613 05:01:11.255961 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7138d1ac_8b33_4219_b8fd_303eae7e0334.slice/crio-9cca795f5dd5b68099bc97b256a60659e75f5b5b49364751929cfab934e5b0dc WatchSource:0}: Error finding container 9cca795f5dd5b68099bc97b256a60659e75f5b5b49364751929cfab934e5b0dc: Status 404 returned error can't find the container with id 9cca795f5dd5b68099bc97b256a60659e75f5b5b49364751929cfab934e5b0dc Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.321363 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5f968f88cc-wrmdl" event={"ID":"7138d1ac-8b33-4219-b8fd-303eae7e0334","Type":"ContainerStarted","Data":"9cca795f5dd5b68099bc97b256a60659e75f5b5b49364751929cfab934e5b0dc"} Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.322876 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"34b3267b518171202ecb5298d3fe6ca5473463aa9ac2d18dc7ba860e88d45f23"} Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.393550 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv"] Jun 13 05:01:11 crc kubenswrapper[4894]: W0613 05:01:11.400513 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b112a53_8bb0_4587_9f69_debaf87494c9.slice/crio-4d1bb93c568285684e004567134b9311b6940aa9f33d5fa8d0d98c84c8b07af5 WatchSource:0}: Error finding container 4d1bb93c568285684e004567134b9311b6940aa9f33d5fa8d0d98c84c8b07af5: Status 404 returned error can't find the container with id 4d1bb93c568285684e004567134b9311b6940aa9f33d5fa8d0d98c84c8b07af5 Jun 13 05:01:11 crc kubenswrapper[4894]: I0613 05:01:11.436350 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:11 crc kubenswrapper[4894]: E0613 05:01:11.437675 4894 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jun 13 05:01:11 crc kubenswrapper[4894]: E0613 05:01:11.437774 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist podName:8d6bd32d-109c-4f18-a8de-057098dae117 nodeName:}" failed. No retries permitted until 2025-06-13 05:01:12.437745092 +0000 UTC m=+630.883992565 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist") pod "speaker-5z464" (UID: "8d6bd32d-109c-4f18-a8de-057098dae117") : secret "metallb-memberlist" not found Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.343955 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5f968f88cc-wrmdl" event={"ID":"7138d1ac-8b33-4219-b8fd-303eae7e0334","Type":"ContainerStarted","Data":"05042fcd4e9e29ba0bd7f5217b3407923611ded544bd18b9e8d865d1d505ef08"} Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.344021 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5f968f88cc-wrmdl" event={"ID":"7138d1ac-8b33-4219-b8fd-303eae7e0334","Type":"ContainerStarted","Data":"4dbcb44a3fa83dbf7fb1e299124e293b753ac4c66d19c29b92b691f86d4a7492"} Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.344142 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.345095 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" event={"ID":"3b112a53-8bb0-4587-9f69-debaf87494c9","Type":"ContainerStarted","Data":"4d1bb93c568285684e004567134b9311b6940aa9f33d5fa8d0d98c84c8b07af5"} Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.368906 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5f968f88cc-wrmdl" podStartSLOduration=2.368882256 podStartE2EDuration="2.368882256s" podCreationTimestamp="2025-06-13 05:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:01:12.360989291 +0000 UTC m=+630.807236784" watchObservedRunningTime="2025-06-13 05:01:12.368882256 +0000 UTC m=+630.815129709" Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.450375 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.458707 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8d6bd32d-109c-4f18-a8de-057098dae117-memberlist\") pod \"speaker-5z464\" (UID: \"8d6bd32d-109c-4f18-a8de-057098dae117\") " pod="metallb-system/speaker-5z464" Jun 13 05:01:12 crc kubenswrapper[4894]: I0613 05:01:12.496728 4894 util.go:30] "No sandbox for pod can be found. 
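The two MountVolume.SetUp failures for the "memberlist" volume above are retried with a doubling delay (no retries for 500ms, then 1s) until the metallb-memberlist secret appears, and the mount finally succeeds at 05:01:12.458. The kubelet's real retry lives in nestedpendingoperations; the sketch below only mimics that wait with client-go and the apimachinery backoff helper (namespace and secret name from the log, everything else assumed).

    package sketch

    import (
        "context"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
    )

    // waitForMemberlistSecret polls until metallb-system/metallb-memberlist exists,
    // doubling the delay between attempts much like the 500ms -> 1s retries in the
    // log. This is an analogous sketch, not the kubelet's own retry code.
    func waitForMemberlistSecret(client kubernetes.Interface) error {
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}
        return wait.ExponentialBackoff(backoff, func() (bool, error) {
            _, err := client.CoreV1().Secrets("metallb-system").Get(
                context.TODO(), "metallb-memberlist", metav1.GetOptions{})
            if err != nil {
                return false, nil // not there yet (or transient error): retry after the next backoff step
            }
            return true, nil
        })
    }

If the secret never shows up, the helper gives up after the configured number of steps; the kubelet instead keeps logging and retrying the mount, as the entries above show.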
Need to start a new one" pod="metallb-system/speaker-5z464" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.356156 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5z464" event={"ID":"8d6bd32d-109c-4f18-a8de-057098dae117","Type":"ContainerStarted","Data":"3f78a73c54e1edb82ec97ce2e7ff754dd13eb551d2e551f6e20dd9c09126db50"} Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.356556 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5z464" event={"ID":"8d6bd32d-109c-4f18-a8de-057098dae117","Type":"ContainerStarted","Data":"7d94881dc43dc3ff1e151742b823c74f1fa950e87c681f046c127bc3d3ea42eb"} Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.356568 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5z464" event={"ID":"8d6bd32d-109c-4f18-a8de-057098dae117","Type":"ContainerStarted","Data":"5cc33b18b643a993a7b28ea6a52139c6430dd21d56708b38a77a82b74a02f39d"} Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.356803 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5z464" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.389543 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5z464" podStartSLOduration=3.38952871 podStartE2EDuration="3.38952871s" podCreationTimestamp="2025-06-13 05:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:01:13.382675064 +0000 UTC m=+631.828922527" watchObservedRunningTime="2025-06-13 05:01:13.38952871 +0000 UTC m=+631.835776173" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.422560 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-b2fgf"] Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.422766 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-b2fgf" podUID="0cb242c4-962c-4e89-8ccf-18e99818513e" containerName="container-00" containerID="cri-o://65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246" gracePeriod=2 Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.435144 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-b2fgf"] Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.483887 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b2fgf" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.566730 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host\") pod \"0cb242c4-962c-4e89-8ccf-18e99818513e\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.566882 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg2n4\" (UniqueName: \"kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4\") pod \"0cb242c4-962c-4e89-8ccf-18e99818513e\" (UID: \"0cb242c4-962c-4e89-8ccf-18e99818513e\") " Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.566821 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host" (OuterVolumeSpecName: "host") pod "0cb242c4-962c-4e89-8ccf-18e99818513e" (UID: "0cb242c4-962c-4e89-8ccf-18e99818513e"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.568495 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cb242c4-962c-4e89-8ccf-18e99818513e-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.573860 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4" (OuterVolumeSpecName: "kube-api-access-kg2n4") pod "0cb242c4-962c-4e89-8ccf-18e99818513e" (UID: "0cb242c4-962c-4e89-8ccf-18e99818513e"). InnerVolumeSpecName "kube-api-access-kg2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:13 crc kubenswrapper[4894]: I0613 05:01:13.670287 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg2n4\" (UniqueName: \"kubernetes.io/projected/0cb242c4-962c-4e89-8ccf-18e99818513e-kube-api-access-kg2n4\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.314187 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb242c4-962c-4e89-8ccf-18e99818513e" path="/var/lib/kubelet/pods/0cb242c4-962c-4e89-8ccf-18e99818513e/volumes" Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.387330 4894 generic.go:334] "Generic (PLEG): container finished" podID="0cb242c4-962c-4e89-8ccf-18e99818513e" containerID="65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246" exitCode=0 Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.388279 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b2fgf" Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.388622 4894 scope.go:117] "RemoveContainer" containerID="65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246" Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.427244 4894 scope.go:117] "RemoveContainer" containerID="65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246" Jun 13 05:01:14 crc kubenswrapper[4894]: E0613 05:01:14.432158 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246\": container with ID starting with 65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246 not found: ID does not exist" containerID="65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246" Jun 13 05:01:14 crc kubenswrapper[4894]: I0613 05:01:14.432215 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246"} err="failed to get container status \"65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246\": rpc error: code = NotFound desc = could not find container \"65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246\": container with ID starting with 65ddabe2264b7faae3afe6df664f0441290127734b6b40daa6282ae6c3cdf246 not found: ID does not exist" Jun 13 05:01:20 crc kubenswrapper[4894]: I0613 05:01:20.427019 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" event={"ID":"3b112a53-8bb0-4587-9f69-debaf87494c9","Type":"ContainerStarted","Data":"6381986ac426b465cbc67da3fc6e6c4729e2e9d20f6c001b3473ca2f70a29f0b"} Jun 13 05:01:20 crc kubenswrapper[4894]: I0613 
05:01:20.427541 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:20 crc kubenswrapper[4894]: I0613 05:01:20.428612 4894 generic.go:334] "Generic (PLEG): container finished" podID="ce477a54-7315-4d18-9fc3-af3bd9216888" containerID="0d87ab26d6cf5e70bb55af3e63d4aca74e0cf79266d89e42c910857477346804" exitCode=0 Jun 13 05:01:20 crc kubenswrapper[4894]: I0613 05:01:20.428679 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerDied","Data":"0d87ab26d6cf5e70bb55af3e63d4aca74e0cf79266d89e42c910857477346804"} Jun 13 05:01:20 crc kubenswrapper[4894]: I0613 05:01:20.445271 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" podStartSLOduration=2.338936452 podStartE2EDuration="10.445255959s" podCreationTimestamp="2025-06-13 05:01:10 +0000 UTC" firstStartedPulling="2025-06-13 05:01:11.402978732 +0000 UTC m=+629.849226205" lastFinishedPulling="2025-06-13 05:01:19.509298249 +0000 UTC m=+637.955545712" observedRunningTime="2025-06-13 05:01:20.443777157 +0000 UTC m=+638.890024620" watchObservedRunningTime="2025-06-13 05:01:20.445255959 +0000 UTC m=+638.891503422" Jun 13 05:01:21 crc kubenswrapper[4894]: I0613 05:01:21.017904 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5f968f88cc-wrmdl" Jun 13 05:01:21 crc kubenswrapper[4894]: I0613 05:01:21.435976 4894 generic.go:334] "Generic (PLEG): container finished" podID="ce477a54-7315-4d18-9fc3-af3bd9216888" containerID="4fb07971197561e5d524f09d73aa4cbff2289c5a9c832f2589cdccb191226a3d" exitCode=0 Jun 13 05:01:21 crc kubenswrapper[4894]: I0613 05:01:21.436045 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerDied","Data":"4fb07971197561e5d524f09d73aa4cbff2289c5a9c832f2589cdccb191226a3d"} Jun 13 05:01:22 crc kubenswrapper[4894]: I0613 05:01:22.455236 4894 generic.go:334] "Generic (PLEG): container finished" podID="ce477a54-7315-4d18-9fc3-af3bd9216888" containerID="2b0c2caef4a89c9fa7b42037927bbb14b725c1a7ab95df7e55a319cbcf0bcc06" exitCode=0 Jun 13 05:01:22 crc kubenswrapper[4894]: I0613 05:01:22.455477 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerDied","Data":"2b0c2caef4a89c9fa7b42037927bbb14b725c1a7ab95df7e55a319cbcf0bcc06"} Jun 13 05:01:22 crc kubenswrapper[4894]: I0613 05:01:22.507013 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5z464" Jun 13 05:01:23 crc kubenswrapper[4894]: I0613 05:01:23.463465 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"3aa055c0701b823f3d4cf55ed05e2224c48da888ff3182badf7428341688c489"} Jun 13 05:01:23 crc kubenswrapper[4894]: I0613 05:01:23.464031 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"237a5ec1d7caac445d8a86185448eabb8b8d4753a37fd674d83bd5f23c5c553d"} Jun 13 05:01:23 crc kubenswrapper[4894]: I0613 05:01:23.464043 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
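The pod_startup_latency_tracker entry for frr-k8s-webhook-server above reports two numbers: podStartE2EDuration (~10.445s from pod creation to observed running) and podStartSLOduration (~2.339s), which excludes the image-pull window between firstStartedPulling (05:01:11.403) and lastFinishedPulling (05:01:19.509), roughly 8.106s. A short check of that arithmetic in Go, using the timestamps exactly as they appear in the log.

    package sketch

    import (
        "fmt"
        "time"
    )

    // webhookStartupSLO recomputes podStartSLOduration for
    // frr-k8s-webhook-server-8457d999f9-dn8gv: end-to-end startup time minus the
    // image-pull window that the kubelet logged.
    func webhookStartupSLO() {
        const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds are accepted when parsing
        firstPull, _ := time.Parse(layout, "2025-06-13 05:01:11.402978732 +0000 UTC")
        lastPull, _ := time.Parse(layout, "2025-06-13 05:01:19.509298249 +0000 UTC")

        e2e := 10445255959 * time.Nanosecond // podStartE2EDuration=10.445255959s
        pull := lastPull.Sub(firstPull)      // ~8.106319517s spent pulling images
        fmt.Println(e2e - pull)              // ~2.3389s, matching the logged podStartSLOduration
    }

The same relation holds for the frr-k8s-kwsg5 entry further down (14.51s end-to-end, ~8.40s pulling, ~6.11s SLO duration), while pods whose images were already present (controller, speaker) show identical SLO and E2E values and zeroed pull timestamps.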
pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"bf06a3f87ca733680a97f5f54191105d763fc51e445c4eec00f19e4023122653"} Jun 13 05:01:23 crc kubenswrapper[4894]: I0613 05:01:23.464053 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"c5e28c20286a9d53094a3330fa9ab4ccf256ab3555429b052fa4b3e4b67bc69e"} Jun 13 05:01:23 crc kubenswrapper[4894]: I0613 05:01:23.464062 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"55d18b0688149270d1083ffd61b5ef650820d496fc5a76367089b25afcedaca4"} Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.260764 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.260947 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" containerName="controller-manager" containerID="cri-o://0bfd98c0ce4b67a8067b317e429f98c8261ed79df62d2606ccd098b1f821fa2c" gracePeriod=30 Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.359509 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.359766 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" podUID="5692fd22-0500-4b57-944f-f440839634cc" containerName="route-controller-manager" containerID="cri-o://bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f" gracePeriod=30 Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.482956 4894 generic.go:334] "Generic (PLEG): container finished" podID="00185c58-85dc-4395-9b1e-9662609bd88a" containerID="0bfd98c0ce4b67a8067b317e429f98c8261ed79df62d2606ccd098b1f821fa2c" exitCode=0 Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.483225 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" event={"ID":"00185c58-85dc-4395-9b1e-9662609bd88a","Type":"ContainerDied","Data":"0bfd98c0ce4b67a8067b317e429f98c8261ed79df62d2606ccd098b1f821fa2c"} Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.487123 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kwsg5" event={"ID":"ce477a54-7315-4d18-9fc3-af3bd9216888","Type":"ContainerStarted","Data":"0f057fbb6693c0bdd8acd127c61a479ebcf224e27e7dbabcf494f7d856e3bed3"} Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.487489 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.510673 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kwsg5" podStartSLOduration=6.107598173 podStartE2EDuration="14.510647166s" podCreationTimestamp="2025-06-13 05:01:10 +0000 UTC" firstStartedPulling="2025-06-13 05:01:11.103798136 +0000 UTC m=+629.550045599" lastFinishedPulling="2025-06-13 05:01:19.506847099 +0000 UTC m=+637.953094592" observedRunningTime="2025-06-13 05:01:24.507542037 
+0000 UTC m=+642.953789500" watchObservedRunningTime="2025-06-13 05:01:24.510647166 +0000 UTC m=+642.956894629" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.683017 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.746105 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kz8k\" (UniqueName: \"kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k\") pod \"00185c58-85dc-4395-9b1e-9662609bd88a\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.746184 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config\") pod \"00185c58-85dc-4395-9b1e-9662609bd88a\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.747064 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config" (OuterVolumeSpecName: "config") pod "00185c58-85dc-4395-9b1e-9662609bd88a" (UID: "00185c58-85dc-4395-9b1e-9662609bd88a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.754254 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k" (OuterVolumeSpecName: "kube-api-access-8kz8k") pod "00185c58-85dc-4395-9b1e-9662609bd88a" (UID: "00185c58-85dc-4395-9b1e-9662609bd88a"). InnerVolumeSpecName "kube-api-access-8kz8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.789347 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.847450 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles\") pod \"00185c58-85dc-4395-9b1e-9662609bd88a\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.847550 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca\") pod \"00185c58-85dc-4395-9b1e-9662609bd88a\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.847632 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert\") pod \"00185c58-85dc-4395-9b1e-9662609bd88a\" (UID: \"00185c58-85dc-4395-9b1e-9662609bd88a\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848097 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "00185c58-85dc-4395-9b1e-9662609bd88a" (UID: "00185c58-85dc-4395-9b1e-9662609bd88a"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848317 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca" (OuterVolumeSpecName: "client-ca") pod "00185c58-85dc-4395-9b1e-9662609bd88a" (UID: "00185c58-85dc-4395-9b1e-9662609bd88a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848668 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kz8k\" (UniqueName: \"kubernetes.io/projected/00185c58-85dc-4395-9b1e-9662609bd88a-kube-api-access-8kz8k\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848686 4894 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848698 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.848708 4894 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00185c58-85dc-4395-9b1e-9662609bd88a-client-ca\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.865019 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00185c58-85dc-4395-9b1e-9662609bd88a" (UID: "00185c58-85dc-4395-9b1e-9662609bd88a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950087 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrdx9\" (UniqueName: \"kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9\") pod \"5692fd22-0500-4b57-944f-f440839634cc\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950175 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config\") pod \"5692fd22-0500-4b57-944f-f440839634cc\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950201 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert\") pod \"5692fd22-0500-4b57-944f-f440839634cc\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950292 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca\") pod \"5692fd22-0500-4b57-944f-f440839634cc\" (UID: \"5692fd22-0500-4b57-944f-f440839634cc\") " Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950505 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00185c58-85dc-4395-9b1e-9662609bd88a-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950847 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "5692fd22-0500-4b57-944f-f440839634cc" (UID: "5692fd22-0500-4b57-944f-f440839634cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.950872 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config" (OuterVolumeSpecName: "config") pod "5692fd22-0500-4b57-944f-f440839634cc" (UID: "5692fd22-0500-4b57-944f-f440839634cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.954327 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9" (OuterVolumeSpecName: "kube-api-access-zrdx9") pod "5692fd22-0500-4b57-944f-f440839634cc" (UID: "5692fd22-0500-4b57-944f-f440839634cc"). InnerVolumeSpecName "kube-api-access-zrdx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:24 crc kubenswrapper[4894]: I0613 05:01:24.954402 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5692fd22-0500-4b57-944f-f440839634cc" (UID: "5692fd22-0500-4b57-944f-f440839634cc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.051579 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrdx9\" (UniqueName: \"kubernetes.io/projected/5692fd22-0500-4b57-944f-f440839634cc-kube-api-access-zrdx9\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.051625 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.051636 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5692fd22-0500-4b57-944f-f440839634cc-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.051646 4894 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5692fd22-0500-4b57-944f-f440839634cc-client-ca\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.497142 4894 generic.go:334] "Generic (PLEG): container finished" podID="5692fd22-0500-4b57-944f-f440839634cc" containerID="bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f" exitCode=0 Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.497242 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.497280 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" event={"ID":"5692fd22-0500-4b57-944f-f440839634cc","Type":"ContainerDied","Data":"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f"} Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.499099 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4" event={"ID":"5692fd22-0500-4b57-944f-f440839634cc","Type":"ContainerDied","Data":"054e2cb16241954a4cb5a57daa1a698eb30b95bc753477414e9d86c15408e406"} Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.499150 4894 scope.go:117] "RemoveContainer" containerID="bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.503275 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.513880 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkq2l" event={"ID":"00185c58-85dc-4395-9b1e-9662609bd88a","Type":"ContainerDied","Data":"5a5fac4256d9a2eed078b7f88158218c044fba71e8f3627bb804948aed24a577"} Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.531101 4894 scope.go:117] "RemoveContainer" containerID="bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f" Jun 13 05:01:25 crc kubenswrapper[4894]: E0613 05:01:25.532315 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f\": container with ID starting with bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f not found: ID does not exist" containerID="bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.532392 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f"} err="failed to get container status \"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f\": rpc error: code = NotFound desc = could not find container \"bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f\": container with ID starting with bf29493037d271e227fbcd34ab50894c0743d2de867a9fa382befb624743784f not found: ID does not exist" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.532425 4894 scope.go:117] "RemoveContainer" containerID="0bfd98c0ce4b67a8067b317e429f98c8261ed79df62d2606ccd098b1f821fa2c" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.567101 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.579031 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ls4"] Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.581728 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.584739 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkq2l"] Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.793090 4894 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.907053 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:25 crc kubenswrapper[4894]: I0613 05:01:25.977193 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242123 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.242335 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb242c4-962c-4e89-8ccf-18e99818513e" containerName="container-00" Jun 13 
05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242345 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb242c4-962c-4e89-8ccf-18e99818513e" containerName="container-00" Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.242357 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" containerName="controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242363 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" containerName="controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.242375 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5692fd22-0500-4b57-944f-f440839634cc" containerName="route-controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242381 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5692fd22-0500-4b57-944f-f440839634cc" containerName="route-controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242472 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5692fd22-0500-4b57-944f-f440839634cc" containerName="route-controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242486 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb242c4-962c-4e89-8ccf-18e99818513e" containerName="container-00" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242498 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" containerName="controller-manager" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.242842 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.246458 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.246571 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.261784 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.265386 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g"] Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.266000 4894 util.go:30] "No sandbox for pod can be found. 
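The SyncLoop (probe) lines at 05:01:25 above show the frr-k8s-kwsg5 startup probe reporting "unhealthy" once and then "started" about 70ms later, after which normal liveness/readiness probing takes over. A minimal sketch of what a startup probe declaration looks like with the Go API types; the endpoint, port and thresholds below are illustrative assumptions, not values recoverable from the log.

    package sketch

    import (
        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    // exampleStartupProbe sketches the kind of probe behind the "SyncLoop (probe)"
    // startup transitions above. Every concrete value here is assumed.
    func exampleStartupProbe() *corev1.Probe {
        return &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(8080)},
            },
            PeriodSeconds:    5,  // re-probe every 5s while the container starts
            FailureThreshold: 30, // allow up to ~150s before the kubelet restarts it
        }
    }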
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.269489 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n"] Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.269992 4894 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.270015 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.270056 4894 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.270066 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.270232 4894 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.270246 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.271222 4894 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: 
E0613 05:01:26.271237 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.271296 4894 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.271308 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: W0613 05:01:26.271321 4894 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.271330 4894 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.271531 4894 util.go:30] "No sandbox for pod can be found. 
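The "forbidden: User \"system:node:crc\" cannot list resource ..." reflector warnings above are transient: with the Node authorizer and NodeRestriction admission in place, a kubelet may only read the Secrets and ConfigMaps referenced by pods already bound to it, so its watches for the new route-controller-manager pod's objects fail until that binding is registered; the "Caches populated" lines that follow show the recovery. A hedged sketch of how such access could be checked from a given set of credentials with a SelfSubjectAccessReview (namespace, resource, object name and verb taken from the log; the client setup is assumed).

    package sketch

    import (
        "context"
        "fmt"

        authv1 "k8s.io/api/authorization/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // canListClientCA asks the API server whether the current credentials (imagine
    // the node's own identity here) may list the ConfigMap the reflector was denied
    // on. This only illustrates the check; the kubelet itself simply retries its
    // watches until the node/pod relationship exists.
    func canListClientCA(client kubernetes.Interface) (bool, error) {
        review := &authv1.SelfSubjectAccessReview{
            Spec: authv1.SelfSubjectAccessReviewSpec{
                ResourceAttributes: &authv1.ResourceAttributes{
                    Namespace: "openshift-route-controller-manager",
                    Verb:      "list",
                    Resource:  "configmaps",
                    Name:      "client-ca",
                },
            },
        }
        resp, err := client.AuthorizationV1().SelfSubjectAccessReviews().Create(
            context.TODO(), review, metav1.CreateOptions{})
        if err != nil {
            return false, err
        }
        fmt.Println("allowed:", resp.Status.Allowed, "reason:", resp.Status.Reason)
        return resp.Status.Allowed, nil
    }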
Need to start a new one" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281002 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281530 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281645 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281664 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281765 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.281893 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.284148 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00185c58-85dc-4395-9b1e-9662609bd88a" path="/var/lib/kubelet/pods/00185c58-85dc-4395-9b1e-9662609bd88a/volumes" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.284813 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5692fd22-0500-4b57-944f-f440839634cc" path="/var/lib/kubelet/pods/5692fd22-0500-4b57-944f-f440839634cc/volumes" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.286193 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.319275 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g"] Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.321557 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n"] Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.333131 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g"] Jun 13 05:01:26 crc kubenswrapper[4894]: E0613 05:01:26.333568 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-hc6ls serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" podUID="c9e50dca-f8af-47e3-988a-6a9b49d78b07" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371451 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-client-ca\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371493 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-config\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371517 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjtdk\" (UniqueName: \"kubernetes.io/projected/4682135b-b74b-48b3-8c10-c3be096b1d9e-kube-api-access-bjtdk\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371538 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-client-ca\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371733 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371810 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4682135b-b74b-48b3-8c10-c3be096b1d9e-serving-cert\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371867 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-proxy-ca-bundles\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371918 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6ls\" (UniqueName: \"kubernetes.io/projected/c9e50dca-f8af-47e3-988a-6a9b49d78b07-kube-api-access-hc6ls\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371972 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lwz\" (UniqueName: \"kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz\") pod \"openstack-operator-index-gjh7t\" (UID: \"6d7095d7-d711-43ee-bc6f-c86b3cce0d33\") " pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.371994 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c9e50dca-f8af-47e3-988a-6a9b49d78b07-serving-cert\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473382 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e50dca-f8af-47e3-988a-6a9b49d78b07-serving-cert\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473436 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-client-ca\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473462 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-config\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473488 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjtdk\" (UniqueName: \"kubernetes.io/projected/4682135b-b74b-48b3-8c10-c3be096b1d9e-kube-api-access-bjtdk\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473507 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-client-ca\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473898 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.473924 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4682135b-b74b-48b3-8c10-c3be096b1d9e-serving-cert\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.474709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-proxy-ca-bundles\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: 
\"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.474740 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6ls\" (UniqueName: \"kubernetes.io/projected/c9e50dca-f8af-47e3-988a-6a9b49d78b07-kube-api-access-hc6ls\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.474765 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lwz\" (UniqueName: \"kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz\") pod \"openstack-operator-index-gjh7t\" (UID: \"6d7095d7-d711-43ee-bc6f-c86b3cce0d33\") " pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.474622 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-client-ca\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.474626 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-config\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.476320 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4682135b-b74b-48b3-8c10-c3be096b1d9e-proxy-ca-bundles\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.493753 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4682135b-b74b-48b3-8c10-c3be096b1d9e-serving-cert\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.494736 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lwz\" (UniqueName: \"kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz\") pod \"openstack-operator-index-gjh7t\" (UID: \"6d7095d7-d711-43ee-bc6f-c86b3cce0d33\") " pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.494843 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjtdk\" (UniqueName: \"kubernetes.io/projected/4682135b-b74b-48b3-8c10-c3be096b1d9e-kube-api-access-bjtdk\") pod \"controller-manager-855f6b9fdc-7wh5n\" (UID: \"4682135b-b74b-48b3-8c10-c3be096b1d9e\") " pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.519552 4894 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.543944 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.557274 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.618087 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.857191 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n"] Jun 13 05:01:26 crc kubenswrapper[4894]: I0613 05:01:26.995047 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.470427 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.473757 4894 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.473820 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e50dca-f8af-47e3-988a-6a9b49d78b07-serving-cert podName:c9e50dca-f8af-47e3-988a-6a9b49d78b07 nodeName:}" failed. No retries permitted until 2025-06-13 05:01:27.973799503 +0000 UTC m=+646.420046966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c9e50dca-f8af-47e3-988a-6a9b49d78b07-serving-cert") pod "route-controller-manager-55f9bbb76-kqf9g" (UID: "c9e50dca-f8af-47e3-988a-6a9b49d78b07") : failed to sync secret cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.475227 4894 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.475377 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-client-ca podName:c9e50dca-f8af-47e3-988a-6a9b49d78b07 nodeName:}" failed. No retries permitted until 2025-06-13 05:01:27.975359978 +0000 UTC m=+646.421607441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-client-ca") pod "route-controller-manager-55f9bbb76-kqf9g" (UID: "c9e50dca-f8af-47e3-988a-6a9b49d78b07") : failed to sync configmap cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.476748 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config\") pod \"route-controller-manager-55f9bbb76-kqf9g\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.492403 4894 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.517484 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.521639 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.526679 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" event={"ID":"4682135b-b74b-48b3-8c10-c3be096b1d9e","Type":"ContainerStarted","Data":"74ca32edd6f9748b235b515d74528c8d9b461f28064dae94de82b32b0cddc038"} Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.526725 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" event={"ID":"4682135b-b74b-48b3-8c10-c3be096b1d9e","Type":"ContainerStarted","Data":"c6386b40f6870c01b784880bfa5f9e0314b741f84951ece465bf9fb7f65fde50"} Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.527867 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.530141 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.530947 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.531208 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gjh7t" event={"ID":"6d7095d7-d711-43ee-bc6f-c86b3cce0d33","Type":"ContainerStarted","Data":"bfaeee07359ed4422609883938cc206c0740821e998563b0014e69d8be722d43"} Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.533166 4894 projected.go:194] Error preparing data for projected volume kube-api-access-hc6ls for pod openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g: failed to sync configmap cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: E0613 05:01:27.533245 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9e50dca-f8af-47e3-988a-6a9b49d78b07-kube-api-access-hc6ls podName:c9e50dca-f8af-47e3-988a-6a9b49d78b07 nodeName:}" failed. No retries permitted until 2025-06-13 05:01:28.033223597 +0000 UTC m=+646.479471060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hc6ls" (UniqueName: "kubernetes.io/projected/c9e50dca-f8af-47e3-988a-6a9b49d78b07-kube-api-access-hc6ls") pod "route-controller-manager-55f9bbb76-kqf9g" (UID: "c9e50dca-f8af-47e3-988a-6a9b49d78b07") : failed to sync configmap cache: timed out waiting for the condition Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.535361 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.549069 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-855f6b9fdc-7wh5n" podStartSLOduration=3.549051317 podStartE2EDuration="3.549051317s" podCreationTimestamp="2025-06-13 05:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:01:27.540762941 +0000 UTC m=+645.987010404" watchObservedRunningTime="2025-06-13 05:01:27.549051317 +0000 UTC m=+645.995298780" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.571722 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.596053 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config\") pod \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\" (UID: \"c9e50dca-f8af-47e3-988a-6a9b49d78b07\") " Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.596584 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config" (OuterVolumeSpecName: "config") pod "c9e50dca-f8af-47e3-988a-6a9b49d78b07" (UID: "c9e50dca-f8af-47e3-988a-6a9b49d78b07"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.697751 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.716057 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.875997 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj"] Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.878490 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.879394 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g"] Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.883043 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.883222 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.883308 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.883389 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f9bbb76-kqf9g"] Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.883110 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.884391 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.884496 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jun 13 05:01:27 crc kubenswrapper[4894]: I0613 05:01:27.886367 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj"] Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.000844 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfll\" (UniqueName: \"kubernetes.io/projected/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-kube-api-access-5mfll\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.000916 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-config\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.001108 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-client-ca\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.001237 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-serving-cert\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.001357 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc6ls\" (UniqueName: \"kubernetes.io/projected/c9e50dca-f8af-47e3-988a-6a9b49d78b07-kube-api-access-hc6ls\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.001379 4894 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e50dca-f8af-47e3-988a-6a9b49d78b07-serving-cert\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.001388 4894 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9e50dca-f8af-47e3-988a-6a9b49d78b07-client-ca\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.102780 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfll\" (UniqueName: \"kubernetes.io/projected/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-kube-api-access-5mfll\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.102878 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-config\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.102933 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-client-ca\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.102981 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-serving-cert\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc 
kubenswrapper[4894]: I0613 05:01:28.103829 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-client-ca\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.106212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-config\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.129318 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-serving-cert\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.132640 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfll\" (UniqueName: \"kubernetes.io/projected/66c27c13-2e06-46ce-bdb7-f95fa3b5281c-kube-api-access-5mfll\") pod \"route-controller-manager-7f8c58ff9c-xzwdj\" (UID: \"66c27c13-2e06-46ce-bdb7-f95fa3b5281c\") " pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.199009 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:28 crc kubenswrapper[4894]: I0613 05:01:28.287746 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e50dca-f8af-47e3-988a-6a9b49d78b07" path="/var/lib/kubelet/pods/c9e50dca-f8af-47e3-988a-6a9b49d78b07/volumes" Jun 13 05:01:29 crc kubenswrapper[4894]: I0613 05:01:29.623978 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:29 crc kubenswrapper[4894]: I0613 05:01:29.916994 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj"] Jun 13 05:01:29 crc kubenswrapper[4894]: W0613 05:01:29.926952 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c27c13_2e06_46ce_bdb7_f95fa3b5281c.slice/crio-a5e6f9d563a37d28231dfdc17118fe4de8355538a361bf2810e8a0f5ff698619 WatchSource:0}: Error finding container a5e6f9d563a37d28231dfdc17118fe4de8355538a361bf2810e8a0f5ff698619: Status 404 returned error can't find the container with id a5e6f9d563a37d28231dfdc17118fe4de8355538a361bf2810e8a0f5ff698619 Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.233985 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c727t"] Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.236380 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.239304 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-lc8rd" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.248052 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c727t"] Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.335299 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276m2\" (UniqueName: \"kubernetes.io/projected/7f203b92-325f-4c66-9c50-6269d6f628cf-kube-api-access-276m2\") pod \"openstack-operator-index-c727t\" (UID: \"7f203b92-325f-4c66-9c50-6269d6f628cf\") " pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.437157 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276m2\" (UniqueName: \"kubernetes.io/projected/7f203b92-325f-4c66-9c50-6269d6f628cf-kube-api-access-276m2\") pod \"openstack-operator-index-c727t\" (UID: \"7f203b92-325f-4c66-9c50-6269d6f628cf\") " pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.462041 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276m2\" (UniqueName: \"kubernetes.io/projected/7f203b92-325f-4c66-9c50-6269d6f628cf-kube-api-access-276m2\") pod \"openstack-operator-index-c727t\" (UID: \"7f203b92-325f-4c66-9c50-6269d6f628cf\") " pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.578037 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gjh7t" event={"ID":"6d7095d7-d711-43ee-bc6f-c86b3cce0d33","Type":"ContainerStarted","Data":"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867"} Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.578127 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-gjh7t" podUID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" containerName="registry-server" containerID="cri-o://5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867" gracePeriod=2 Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.581330 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" event={"ID":"66c27c13-2e06-46ce-bdb7-f95fa3b5281c","Type":"ContainerStarted","Data":"a46c99d710f36190f917267fc2b806d93e1ee32f88a15d511733d56f68b8fc80"} Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.581370 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" event={"ID":"66c27c13-2e06-46ce-bdb7-f95fa3b5281c","Type":"ContainerStarted","Data":"a5e6f9d563a37d28231dfdc17118fe4de8355538a361bf2810e8a0f5ff698619"} Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.581828 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.587323 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.599767 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.604136 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gjh7t" podStartSLOduration=1.951834251 podStartE2EDuration="4.60411669s" podCreationTimestamp="2025-06-13 05:01:26 +0000 UTC" firstStartedPulling="2025-06-13 05:01:27.015913396 +0000 UTC m=+645.462160849" lastFinishedPulling="2025-06-13 05:01:29.668195815 +0000 UTC m=+648.114443288" observedRunningTime="2025-06-13 05:01:30.599720556 +0000 UTC m=+649.045968029" watchObservedRunningTime="2025-06-13 05:01:30.60411669 +0000 UTC m=+649.050364163" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.926544 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-8457d999f9-dn8gv" Jun 13 05:01:30 crc kubenswrapper[4894]: I0613 05:01:30.951732 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8c58ff9c-xzwdj" podStartSLOduration=4.951713708 podStartE2EDuration="4.951713708s" podCreationTimestamp="2025-06-13 05:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:01:30.624040201 +0000 UTC m=+649.070287674" watchObservedRunningTime="2025-06-13 05:01:30.951713708 +0000 UTC m=+649.397961171" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.078929 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c727t"] Jun 13 05:01:31 crc kubenswrapper[4894]: W0613 05:01:31.083445 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f203b92_325f_4c66_9c50_6269d6f628cf.slice/crio-95f9c71218ce9a84341ff988cbf556ec3bf7d1833de6455c11ade497fdf629ce WatchSource:0}: Error finding container 95f9c71218ce9a84341ff988cbf556ec3bf7d1833de6455c11ade497fdf629ce: Status 404 returned error can't find the container with id 95f9c71218ce9a84341ff988cbf556ec3bf7d1833de6455c11ade497fdf629ce Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.101636 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.250083 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lwz\" (UniqueName: \"kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz\") pod \"6d7095d7-d711-43ee-bc6f-c86b3cce0d33\" (UID: \"6d7095d7-d711-43ee-bc6f-c86b3cce0d33\") " Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.257001 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz" (OuterVolumeSpecName: "kube-api-access-q6lwz") pod "6d7095d7-d711-43ee-bc6f-c86b3cce0d33" (UID: "6d7095d7-d711-43ee-bc6f-c86b3cce0d33"). InnerVolumeSpecName "kube-api-access-q6lwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.352090 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lwz\" (UniqueName: \"kubernetes.io/projected/6d7095d7-d711-43ee-bc6f-c86b3cce0d33-kube-api-access-q6lwz\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.593499 4894 generic.go:334] "Generic (PLEG): container finished" podID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" containerID="5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867" exitCode=0 Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.593836 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gjh7t" event={"ID":"6d7095d7-d711-43ee-bc6f-c86b3cce0d33","Type":"ContainerDied","Data":"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867"} Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.593920 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gjh7t" event={"ID":"6d7095d7-d711-43ee-bc6f-c86b3cce0d33","Type":"ContainerDied","Data":"bfaeee07359ed4422609883938cc206c0740821e998563b0014e69d8be722d43"} Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.593971 4894 scope.go:117] "RemoveContainer" containerID="5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.594241 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gjh7t" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.598760 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c727t" event={"ID":"7f203b92-325f-4c66-9c50-6269d6f628cf","Type":"ContainerStarted","Data":"1fb7469e6592129fc4488eda52e591c836e0d4f64784c4efc7a775dc46f8ed0f"} Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.598813 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c727t" event={"ID":"7f203b92-325f-4c66-9c50-6269d6f628cf","Type":"ContainerStarted","Data":"95f9c71218ce9a84341ff988cbf556ec3bf7d1833de6455c11ade497fdf629ce"} Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.623055 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c727t" podStartSLOduration=1.5764190500000002 podStartE2EDuration="1.623027583s" podCreationTimestamp="2025-06-13 05:01:30 +0000 UTC" firstStartedPulling="2025-06-13 05:01:31.08710013 +0000 UTC m=+649.533347593" lastFinishedPulling="2025-06-13 05:01:31.133708663 +0000 UTC m=+649.579956126" observedRunningTime="2025-06-13 05:01:31.621806238 +0000 UTC m=+650.068053711" watchObservedRunningTime="2025-06-13 05:01:31.623027583 +0000 UTC m=+650.069275076" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.633458 4894 scope.go:117] "RemoveContainer" containerID="5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867" Jun 13 05:01:31 crc kubenswrapper[4894]: E0613 05:01:31.634264 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867\": container with ID starting with 5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867 not found: ID does not exist" 
containerID="5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.634376 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867"} err="failed to get container status \"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867\": rpc error: code = NotFound desc = could not find container \"5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867\": container with ID starting with 5e4dc4118ae70429ec8b06151c79c1e9028208e3146508cd3ce62938d860e867 not found: ID does not exist" Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.648219 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:31 crc kubenswrapper[4894]: I0613 05:01:31.654470 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-gjh7t"] Jun 13 05:01:32 crc kubenswrapper[4894]: I0613 05:01:32.290482 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" path="/var/lib/kubelet/pods/6d7095d7-d711-43ee-bc6f-c86b3cce0d33/volumes" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.242391 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:38 crc kubenswrapper[4894]: E0613 05:01:38.243830 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" containerName="registry-server" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.243860 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" containerName="registry-server" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.244373 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7095d7-d711-43ee-bc6f-c86b3cce0d33" containerName="registry-server" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.246040 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.263423 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.364180 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.364274 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxsf\" (UniqueName: \"kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.364421 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.465931 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.466014 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.466036 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxsf\" (UniqueName: \"kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.466392 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.466544 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.497427 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lzxsf\" (UniqueName: \"kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf\") pod \"certified-operators-mzrvb\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:38 crc kubenswrapper[4894]: I0613 05:01:38.574188 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:39 crc kubenswrapper[4894]: I0613 05:01:39.051520 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:39 crc kubenswrapper[4894]: I0613 05:01:39.656995 4894 generic.go:334] "Generic (PLEG): container finished" podID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerID="ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b" exitCode=0 Jun 13 05:01:39 crc kubenswrapper[4894]: I0613 05:01:39.657112 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerDied","Data":"ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b"} Jun 13 05:01:39 crc kubenswrapper[4894]: I0613 05:01:39.657283 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerStarted","Data":"8438a005ecb0bec5ae557b06a5a664501727b29d30ea8c5b7289f01034e4dd1a"} Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.600959 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.601005 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.651538 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.666705 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerStarted","Data":"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc"} Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.715389 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c727t" Jun 13 05:01:40 crc kubenswrapper[4894]: I0613 05:01:40.909576 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kwsg5" Jun 13 05:01:41 crc kubenswrapper[4894]: I0613 05:01:41.676472 4894 generic.go:334] "Generic (PLEG): container finished" podID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerID="dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc" exitCode=0 Jun 13 05:01:41 crc kubenswrapper[4894]: I0613 05:01:41.676543 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerDied","Data":"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc"} Jun 13 05:01:42 crc kubenswrapper[4894]: I0613 05:01:42.687282 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerStarted","Data":"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533"} Jun 13 05:01:42 crc kubenswrapper[4894]: I0613 05:01:42.710842 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mzrvb" podStartSLOduration=2.080683578 podStartE2EDuration="4.710829464s" podCreationTimestamp="2025-06-13 05:01:38 +0000 UTC" firstStartedPulling="2025-06-13 05:01:39.659153897 +0000 UTC m=+658.105401390" lastFinishedPulling="2025-06-13 05:01:42.289299803 +0000 UTC m=+660.735547276" observedRunningTime="2025-06-13 05:01:42.708589391 +0000 UTC m=+661.154836854" watchObservedRunningTime="2025-06-13 05:01:42.710829464 +0000 UTC m=+661.157076927" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.064003 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4"] Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.065377 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.067189 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vqjvc" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.084242 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4"] Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.138731 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.138796 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbvz\" (UniqueName: \"kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.138853 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.240494 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 
13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.240791 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbvz\" (UniqueName: \"kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.240901 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.241028 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.241324 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.267494 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbvz\" (UniqueName: \"kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz\") pod \"3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.377589 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:43 crc kubenswrapper[4894]: I0613 05:01:43.787136 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4"] Jun 13 05:01:44 crc kubenswrapper[4894]: I0613 05:01:44.701257 4894 generic.go:334] "Generic (PLEG): container finished" podID="318fa333-aef2-42ac-bd8c-6232198b5093" containerID="b7e45af893260783d502f09e46cc30010d1d8d530f9a9f84bea1cf091c0ecaa2" exitCode=0 Jun 13 05:01:44 crc kubenswrapper[4894]: I0613 05:01:44.701377 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" event={"ID":"318fa333-aef2-42ac-bd8c-6232198b5093","Type":"ContainerDied","Data":"b7e45af893260783d502f09e46cc30010d1d8d530f9a9f84bea1cf091c0ecaa2"} Jun 13 05:01:44 crc kubenswrapper[4894]: I0613 05:01:44.701556 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" event={"ID":"318fa333-aef2-42ac-bd8c-6232198b5093","Type":"ContainerStarted","Data":"d9845297c7c6178951456a6da92c8179669ede0e251551552b78c5910ff739e2"} Jun 13 05:01:45 crc kubenswrapper[4894]: I0613 05:01:45.710236 4894 generic.go:334] "Generic (PLEG): container finished" podID="318fa333-aef2-42ac-bd8c-6232198b5093" containerID="50bc917615fc4d7867d7845fbe0210ae282d0d7e5b75d837a02296a4b758e460" exitCode=0 Jun 13 05:01:45 crc kubenswrapper[4894]: I0613 05:01:45.710296 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" event={"ID":"318fa333-aef2-42ac-bd8c-6232198b5093","Type":"ContainerDied","Data":"50bc917615fc4d7867d7845fbe0210ae282d0d7e5b75d837a02296a4b758e460"} Jun 13 05:01:46 crc kubenswrapper[4894]: I0613 05:01:46.720991 4894 generic.go:334] "Generic (PLEG): container finished" podID="318fa333-aef2-42ac-bd8c-6232198b5093" containerID="f251a03966dadc8577b1ca005c30b3c61df0381649ffd119b4a1fe85a27de231" exitCode=0 Jun 13 05:01:46 crc kubenswrapper[4894]: I0613 05:01:46.721124 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" event={"ID":"318fa333-aef2-42ac-bd8c-6232198b5093","Type":"ContainerDied","Data":"f251a03966dadc8577b1ca005c30b3c61df0381649ffd119b4a1fe85a27de231"} Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.174316 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.328323 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle\") pod \"318fa333-aef2-42ac-bd8c-6232198b5093\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.328418 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbvz\" (UniqueName: \"kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz\") pod \"318fa333-aef2-42ac-bd8c-6232198b5093\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.328518 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util\") pod \"318fa333-aef2-42ac-bd8c-6232198b5093\" (UID: \"318fa333-aef2-42ac-bd8c-6232198b5093\") " Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.329562 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle" (OuterVolumeSpecName: "bundle") pod "318fa333-aef2-42ac-bd8c-6232198b5093" (UID: "318fa333-aef2-42ac-bd8c-6232198b5093"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.337007 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz" (OuterVolumeSpecName: "kube-api-access-zwbvz") pod "318fa333-aef2-42ac-bd8c-6232198b5093" (UID: "318fa333-aef2-42ac-bd8c-6232198b5093"). InnerVolumeSpecName "kube-api-access-zwbvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.357501 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util" (OuterVolumeSpecName: "util") pod "318fa333-aef2-42ac-bd8c-6232198b5093" (UID: "318fa333-aef2-42ac-bd8c-6232198b5093"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.431728 4894 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-util\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.431770 4894 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/318fa333-aef2-42ac-bd8c-6232198b5093-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.431793 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbvz\" (UniqueName: \"kubernetes.io/projected/318fa333-aef2-42ac-bd8c-6232198b5093-kube-api-access-zwbvz\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.575151 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.575563 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.649976 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.737092 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.737106 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4" event={"ID":"318fa333-aef2-42ac-bd8c-6232198b5093","Type":"ContainerDied","Data":"d9845297c7c6178951456a6da92c8179669ede0e251551552b78c5910ff739e2"} Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.737373 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9845297c7c6178951456a6da92c8179669ede0e251551552b78c5910ff739e2" Jun 13 05:01:48 crc kubenswrapper[4894]: I0613 05:01:48.772628 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:49 crc kubenswrapper[4894]: I0613 05:01:49.232686 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:50 crc kubenswrapper[4894]: I0613 05:01:50.750796 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mzrvb" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="registry-server" containerID="cri-o://5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533" gracePeriod=2 Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.324375 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.484167 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities\") pod \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.484213 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content\") pod \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.484318 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzxsf\" (UniqueName: \"kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf\") pod \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\" (UID: \"f01f8fd6-b810-44f1-8ee3-52eaa35486f3\") " Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.485472 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities" (OuterVolumeSpecName: "utilities") pod "f01f8fd6-b810-44f1-8ee3-52eaa35486f3" (UID: "f01f8fd6-b810-44f1-8ee3-52eaa35486f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.492989 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf" (OuterVolumeSpecName: "kube-api-access-lzxsf") pod "f01f8fd6-b810-44f1-8ee3-52eaa35486f3" (UID: "f01f8fd6-b810-44f1-8ee3-52eaa35486f3"). InnerVolumeSpecName "kube-api-access-lzxsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.524392 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f01f8fd6-b810-44f1-8ee3-52eaa35486f3" (UID: "f01f8fd6-b810-44f1-8ee3-52eaa35486f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.585243 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzxsf\" (UniqueName: \"kubernetes.io/projected/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-kube-api-access-lzxsf\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.585459 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.585540 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f8fd6-b810-44f1-8ee3-52eaa35486f3-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.760494 4894 generic.go:334] "Generic (PLEG): container finished" podID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerID="5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533" exitCode=0 Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.760558 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerDied","Data":"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533"} Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.760597 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzrvb" event={"ID":"f01f8fd6-b810-44f1-8ee3-52eaa35486f3","Type":"ContainerDied","Data":"8438a005ecb0bec5ae557b06a5a664501727b29d30ea8c5b7289f01034e4dd1a"} Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.760627 4894 scope.go:117] "RemoveContainer" containerID="5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.760912 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzrvb" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.786512 4894 scope.go:117] "RemoveContainer" containerID="dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.820898 4894 scope.go:117] "RemoveContainer" containerID="ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.822257 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.831680 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mzrvb"] Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.840799 4894 scope.go:117] "RemoveContainer" containerID="5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533" Jun 13 05:01:51 crc kubenswrapper[4894]: E0613 05:01:51.841294 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533\": container with ID starting with 5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533 not found: ID does not exist" containerID="5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.841343 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533"} err="failed to get container status \"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533\": rpc error: code = NotFound desc = could not find container \"5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533\": container with ID starting with 5bc37583429436666385fed0ef6684f9e2a6ebffac95991df56e38f7826f9533 not found: ID does not exist" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.841375 4894 scope.go:117] "RemoveContainer" containerID="dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc" Jun 13 05:01:51 crc kubenswrapper[4894]: E0613 05:01:51.841765 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc\": container with ID starting with dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc not found: ID does not exist" containerID="dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.841809 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc"} err="failed to get container status \"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc\": rpc error: code = NotFound desc = could not find container \"dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc\": container with ID starting with dedb9fed1913fffca392f56ed8d6ca56ab53fd3a108550889fb9a7625a7435fc not found: ID does not exist" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.841834 4894 scope.go:117] "RemoveContainer" containerID="ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b" Jun 13 05:01:51 crc kubenswrapper[4894]: E0613 05:01:51.842102 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b\": container with ID starting with ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b not found: ID does not exist" containerID="ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b" Jun 13 05:01:51 crc kubenswrapper[4894]: I0613 05:01:51.842141 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b"} err="failed to get container status \"ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b\": rpc error: code = NotFound desc = could not find container \"ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b\": container with ID starting with ccb3bd4941d1842084cbf511d00d7057effc4d9fd73f497ae1eba7cd262cb54b not found: ID does not exist" Jun 13 05:01:52 crc kubenswrapper[4894]: I0613 05:01:52.289642 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" path="/var/lib/kubelet/pods/f01f8fd6-b810-44f1-8ee3-52eaa35486f3/volumes" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.124586 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g"] Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125082 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="pull" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125096 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="pull" Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125118 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="util" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125126 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="util" Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125140 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="extract" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125148 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="extract" Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125168 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="extract-utilities" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125177 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="extract-utilities" Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125188 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="extract-content" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125196 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="extract-content" Jun 13 05:01:55 crc kubenswrapper[4894]: E0613 05:01:55.125205 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="registry-server" Jun 13 05:01:55 crc 
kubenswrapper[4894]: I0613 05:01:55.125213 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="registry-server" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125338 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01f8fd6-b810-44f1-8ee3-52eaa35486f3" containerName="registry-server" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.125348 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="318fa333-aef2-42ac-bd8c-6232198b5093" containerName="extract" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.126050 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.132000 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-6kw9r" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.145446 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g"] Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.233925 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b857j\" (UniqueName: \"kubernetes.io/projected/712d94c9-1d98-4ff5-8af0-d15cab94e874-kube-api-access-b857j\") pod \"openstack-operator-controller-operator-d66c4c8c7-6cm6g\" (UID: \"712d94c9-1d98-4ff5-8af0-d15cab94e874\") " pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.335039 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b857j\" (UniqueName: \"kubernetes.io/projected/712d94c9-1d98-4ff5-8af0-d15cab94e874-kube-api-access-b857j\") pod \"openstack-operator-controller-operator-d66c4c8c7-6cm6g\" (UID: \"712d94c9-1d98-4ff5-8af0-d15cab94e874\") " pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.352439 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b857j\" (UniqueName: \"kubernetes.io/projected/712d94c9-1d98-4ff5-8af0-d15cab94e874-kube-api-access-b857j\") pod \"openstack-operator-controller-operator-d66c4c8c7-6cm6g\" (UID: \"712d94c9-1d98-4ff5-8af0-d15cab94e874\") " pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.452583 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:01:55 crc kubenswrapper[4894]: I0613 05:01:55.954755 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g"] Jun 13 05:01:56 crc kubenswrapper[4894]: I0613 05:01:56.792270 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" event={"ID":"712d94c9-1d98-4ff5-8af0-d15cab94e874","Type":"ContainerStarted","Data":"1c4493adb75d1b3c5b0d86e70ad6da75b90afe727d02ee8c867d3af090f68520"} Jun 13 05:02:00 crc kubenswrapper[4894]: I0613 05:02:00.820602 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" event={"ID":"712d94c9-1d98-4ff5-8af0-d15cab94e874","Type":"ContainerStarted","Data":"8004d56c808e2f37f4e445eac911a10137c245bac60a288e43b8aa926883632f"} Jun 13 05:02:01 crc kubenswrapper[4894]: I0613 05:02:01.766140 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/crc-debug-9rn6l"] Jun 13 05:02:01 crc kubenswrapper[4894]: I0613 05:02:01.767568 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:01 crc kubenswrapper[4894]: I0613 05:02:01.770839 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vqjvc" Jun 13 05:02:01 crc kubenswrapper[4894]: I0613 05:02:01.946853 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzjh\" (UniqueName: \"kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:01 crc kubenswrapper[4894]: I0613 05:02:01.946957 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.048495 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzjh\" (UniqueName: \"kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.048722 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.048856 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.069444 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzjh\" 
(UniqueName: \"kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh\") pod \"crc-debug-9rn6l\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.089846 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.841597 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" event={"ID":"712d94c9-1d98-4ff5-8af0-d15cab94e874","Type":"ContainerStarted","Data":"dadb1de64d3107c7d402c7c2b75591c9e578065415ac920130f9b9f41eaa9d7d"} Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.842176 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.843908 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/crc-debug-9rn6l" event={"ID":"c0cc030a-5bd7-432b-8859-c4cbf960324d","Type":"ContainerStarted","Data":"61cc63063a9af5968d10a448fc18e286abc113d8a9af96351c1e811904a0abc6"} Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.843975 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/crc-debug-9rn6l" event={"ID":"c0cc030a-5bd7-432b-8859-c4cbf960324d","Type":"ContainerStarted","Data":"e6ed06868cd0cd0de85bd3a8e3b655a24cd5ac8082156565a8c975068710c345"} Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.873275 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" podStartSLOduration=1.307415225 podStartE2EDuration="7.873256135s" podCreationTimestamp="2025-06-13 05:01:55 +0000 UTC" firstStartedPulling="2025-06-13 05:01:55.971547457 +0000 UTC m=+674.417794930" lastFinishedPulling="2025-06-13 05:02:02.537388367 +0000 UTC m=+680.983635840" observedRunningTime="2025-06-13 05:02:02.86951993 +0000 UTC m=+681.315767403" watchObservedRunningTime="2025-06-13 05:02:02.873256135 +0000 UTC m=+681.319503608" Jun 13 05:02:02 crc kubenswrapper[4894]: I0613 05:02:02.891100 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/crc-debug-9rn6l" podStartSLOduration=1.891077257 podStartE2EDuration="1.891077257s" podCreationTimestamp="2025-06-13 05:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:02:02.885252723 +0000 UTC m=+681.331500196" watchObservedRunningTime="2025-06-13 05:02:02.891077257 +0000 UTC m=+681.337324730" Jun 13 05:02:05 crc kubenswrapper[4894]: I0613 05:02:05.456479 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d66c4c8c7-6cm6g" Jun 13 05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.864502 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/crc-debug-9rn6l"] Jun 13 05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.865180 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/crc-debug-9rn6l" podUID="c0cc030a-5bd7-432b-8859-c4cbf960324d" containerName="container-00" containerID="cri-o://61cc63063a9af5968d10a448fc18e286abc113d8a9af96351c1e811904a0abc6" gracePeriod=2 Jun 13 
05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.868768 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/crc-debug-9rn6l"] Jun 13 05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.922799 4894 generic.go:334] "Generic (PLEG): container finished" podID="c0cc030a-5bd7-432b-8859-c4cbf960324d" containerID="61cc63063a9af5968d10a448fc18e286abc113d8a9af96351c1e811904a0abc6" exitCode=0 Jun 13 05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.922862 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ed06868cd0cd0de85bd3a8e3b655a24cd5ac8082156565a8c975068710c345" Jun 13 05:02:12 crc kubenswrapper[4894]: I0613 05:02:12.929602 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.123134 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host\") pod \"c0cc030a-5bd7-432b-8859-c4cbf960324d\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.123280 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hzjh\" (UniqueName: \"kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh\") pod \"c0cc030a-5bd7-432b-8859-c4cbf960324d\" (UID: \"c0cc030a-5bd7-432b-8859-c4cbf960324d\") " Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.123560 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host" (OuterVolumeSpecName: "host") pod "c0cc030a-5bd7-432b-8859-c4cbf960324d" (UID: "c0cc030a-5bd7-432b-8859-c4cbf960324d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.129337 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh" (OuterVolumeSpecName: "kube-api-access-4hzjh") pod "c0cc030a-5bd7-432b-8859-c4cbf960324d" (UID: "c0cc030a-5bd7-432b-8859-c4cbf960324d"). InnerVolumeSpecName "kube-api-access-4hzjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.224484 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hzjh\" (UniqueName: \"kubernetes.io/projected/c0cc030a-5bd7-432b-8859-c4cbf960324d-kube-api-access-4hzjh\") on node \"crc\" DevicePath \"\"" Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.224791 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0cc030a-5bd7-432b-8859-c4cbf960324d-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:02:13 crc kubenswrapper[4894]: I0613 05:02:13.928302 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/crc-debug-9rn6l" Jun 13 05:02:14 crc kubenswrapper[4894]: I0613 05:02:14.284432 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cc030a-5bd7-432b-8859-c4cbf960324d" path="/var/lib/kubelet/pods/c0cc030a-5bd7-432b-8859-c4cbf960324d/volumes" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.130430 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:15 crc kubenswrapper[4894]: E0613 05:02:15.130913 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0cc030a-5bd7-432b-8859-c4cbf960324d" containerName="container-00" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.130926 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cc030a-5bd7-432b-8859-c4cbf960324d" containerName="container-00" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.131042 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0cc030a-5bd7-432b-8859-c4cbf960324d" containerName="container-00" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.131858 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.143587 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.149169 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.149231 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.149261 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crd8l\" (UniqueName: \"kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.250630 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.250736 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.250949 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-crd8l\" (UniqueName: \"kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.251151 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.251192 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.266760 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crd8l\" (UniqueName: \"kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l\") pod \"redhat-marketplace-pxld2\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.445501 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.878460 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:15 crc kubenswrapper[4894]: W0613 05:02:15.888042 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930f2c7c_e3fb_4f72_8c7e_43853b1d2a09.slice/crio-4e4a7e7216552f95b05d51bb4cfcd2ef634b82c63203ab689bff05d4970b7d5b WatchSource:0}: Error finding container 4e4a7e7216552f95b05d51bb4cfcd2ef634b82c63203ab689bff05d4970b7d5b: Status 404 returned error can't find the container with id 4e4a7e7216552f95b05d51bb4cfcd2ef634b82c63203ab689bff05d4970b7d5b Jun 13 05:02:15 crc kubenswrapper[4894]: I0613 05:02:15.940277 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerStarted","Data":"4e4a7e7216552f95b05d51bb4cfcd2ef634b82c63203ab689bff05d4970b7d5b"} Jun 13 05:02:16 crc kubenswrapper[4894]: I0613 05:02:16.947686 4894 generic.go:334] "Generic (PLEG): container finished" podID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerID="a0db7f77e7b984c3186c196de6eb2026ef78c07dae7159f1922615e82f9a6a76" exitCode=0 Jun 13 05:02:16 crc kubenswrapper[4894]: I0613 05:02:16.947730 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerDied","Data":"a0db7f77e7b984c3186c196de6eb2026ef78c07dae7159f1922615e82f9a6a76"} Jun 13 05:02:17 crc kubenswrapper[4894]: I0613 05:02:17.954778 4894 generic.go:334] "Generic (PLEG): container finished" podID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerID="3eb157c277002d16cf91a0df424f82c71b76f02b0f68f2efe9c4b30452dc3178" exitCode=0 Jun 13 05:02:17 crc 
kubenswrapper[4894]: I0613 05:02:17.954882 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerDied","Data":"3eb157c277002d16cf91a0df424f82c71b76f02b0f68f2efe9c4b30452dc3178"} Jun 13 05:02:18 crc kubenswrapper[4894]: I0613 05:02:18.962678 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerStarted","Data":"7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc"} Jun 13 05:02:18 crc kubenswrapper[4894]: I0613 05:02:18.980474 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pxld2" podStartSLOduration=2.446525342 podStartE2EDuration="3.980458328s" podCreationTimestamp="2025-06-13 05:02:15 +0000 UTC" firstStartedPulling="2025-06-13 05:02:16.949069243 +0000 UTC m=+695.395316706" lastFinishedPulling="2025-06-13 05:02:18.483002229 +0000 UTC m=+696.929249692" observedRunningTime="2025-06-13 05:02:18.976506937 +0000 UTC m=+697.422754400" watchObservedRunningTime="2025-06-13 05:02:18.980458328 +0000 UTC m=+697.426705791" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.037541 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.038629 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.042068 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nhwqb" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.046365 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.055002 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.055861 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.058756 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-kg5tx" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.082329 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.093993 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-b554678df-6trss"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.095513 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.123238 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xdzkj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.127433 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jql\" (UniqueName: \"kubernetes.io/projected/6e780a91-140a-4b7b-9748-c3a6c3b954e1-kube-api-access-d6jql\") pod \"designate-operator-controller-manager-b554678df-6trss\" (UID: \"6e780a91-140a-4b7b-9748-c3a6c3b954e1\") " pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.127478 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwzg\" (UniqueName: \"kubernetes.io/projected/250d2934-5f6e-4d4f-96d9-ec258c71909e-kube-api-access-ncwzg\") pod \"cinder-operator-controller-manager-57f4dc9749-rf6b7\" (UID: \"250d2934-5f6e-4d4f-96d9-ec258c71909e\") " pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.127529 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxk6t\" (UniqueName: \"kubernetes.io/projected/784a682d-1749-4399-a1f4-1e8bee7968ce-kube-api-access-gxk6t\") pod \"barbican-operator-controller-manager-9889b4756-lsslv\" (UID: \"784a682d-1749-4399-a1f4-1e8bee7968ce\") " pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.127678 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.128992 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.133254 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-cxnjg" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.146519 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b554678df-6trss"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.178567 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.189691 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.190758 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.194540 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mmkfs" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.210730 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.218773 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.219798 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.227277 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wp57v" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228353 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6jz\" (UniqueName: \"kubernetes.io/projected/ea63dc95-4a48-4ed4-b990-c6990bbe3d33-kube-api-access-kl6jz\") pod \"heat-operator-controller-manager-5486f4b54f-xdn4k\" (UID: \"ea63dc95-4a48-4ed4-b990-c6990bbe3d33\") " pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228379 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpshg\" (UniqueName: \"kubernetes.io/projected/7d3873c8-7bab-42f8-918a-344d87eacce9-kube-api-access-zpshg\") pod \"glance-operator-controller-manager-97b97479c-jw8m6\" (UID: \"7d3873c8-7bab-42f8-918a-344d87eacce9\") " pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228409 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jql\" (UniqueName: \"kubernetes.io/projected/6e780a91-140a-4b7b-9748-c3a6c3b954e1-kube-api-access-d6jql\") pod \"designate-operator-controller-manager-b554678df-6trss\" (UID: \"6e780a91-140a-4b7b-9748-c3a6c3b954e1\") " pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228427 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwzg\" (UniqueName: \"kubernetes.io/projected/250d2934-5f6e-4d4f-96d9-ec258c71909e-kube-api-access-ncwzg\") pod \"cinder-operator-controller-manager-57f4dc9749-rf6b7\" (UID: \"250d2934-5f6e-4d4f-96d9-ec258c71909e\") " pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228460 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxk6t\" (UniqueName: \"kubernetes.io/projected/784a682d-1749-4399-a1f4-1e8bee7968ce-kube-api-access-gxk6t\") pod \"barbican-operator-controller-manager-9889b4756-lsslv\" (UID: \"784a682d-1749-4399-a1f4-1e8bee7968ce\") " pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.228480 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h69g\" (UniqueName: \"kubernetes.io/projected/297946dc-5d6d-4389-bff3-3044865254ef-kube-api-access-4h69g\") pod \"horizon-operator-controller-manager-7777cf768b-bm84t\" (UID: \"297946dc-5d6d-4389-bff3-3044865254ef\") " pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.237126 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.250253 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.251299 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.254947 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xg88r" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.260441 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.261355 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.273221 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.273511 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q85k7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.283286 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.285916 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxk6t\" (UniqueName: \"kubernetes.io/projected/784a682d-1749-4399-a1f4-1e8bee7968ce-kube-api-access-gxk6t\") pod \"barbican-operator-controller-manager-9889b4756-lsslv\" (UID: \"784a682d-1749-4399-a1f4-1e8bee7968ce\") " pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.286456 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jql\" (UniqueName: \"kubernetes.io/projected/6e780a91-140a-4b7b-9748-c3a6c3b954e1-kube-api-access-d6jql\") pod \"designate-operator-controller-manager-b554678df-6trss\" (UID: \"6e780a91-140a-4b7b-9748-c3a6c3b954e1\") " pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.298276 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwzg\" (UniqueName: \"kubernetes.io/projected/250d2934-5f6e-4d4f-96d9-ec258c71909e-kube-api-access-ncwzg\") pod \"cinder-operator-controller-manager-57f4dc9749-rf6b7\" (UID: \"250d2934-5f6e-4d4f-96d9-ec258c71909e\") " 
pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.300911 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.302124 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.307008 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-v5wpv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.316801 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.322410 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.323294 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.330197 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cvz4n" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331124 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2gk\" (UniqueName: \"kubernetes.io/projected/4c379ff5-1113-4698-a898-9c1cb29000cf-kube-api-access-mx2gk\") pod \"ironic-operator-controller-manager-68f4bbb747-nfmz2\" (UID: \"4c379ff5-1113-4698-a898-9c1cb29000cf\") " pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331173 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331212 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpk75\" (UniqueName: \"kubernetes.io/projected/1295c691-04d0-4e6e-a4e6-4f85c6715964-kube-api-access-rpk75\") pod \"manila-operator-controller-manager-75b8755b74-q5plz\" (UID: \"1295c691-04d0-4e6e-a4e6-4f85c6715964\") " pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331229 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxz77\" (UniqueName: \"kubernetes.io/projected/43453734-49dd-48b0-86b4-46b20966f2f5-kube-api-access-pxz77\") pod \"keystone-operator-controller-manager-5ccbd96f89-hrh2h\" (UID: \"43453734-49dd-48b0-86b4-46b20966f2f5\") " pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331253 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zpshg\" (UniqueName: \"kubernetes.io/projected/7d3873c8-7bab-42f8-918a-344d87eacce9-kube-api-access-zpshg\") pod \"glance-operator-controller-manager-97b97479c-jw8m6\" (UID: \"7d3873c8-7bab-42f8-918a-344d87eacce9\") " pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331269 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6jz\" (UniqueName: \"kubernetes.io/projected/ea63dc95-4a48-4ed4-b990-c6990bbe3d33-kube-api-access-kl6jz\") pod \"heat-operator-controller-manager-5486f4b54f-xdn4k\" (UID: \"ea63dc95-4a48-4ed4-b990-c6990bbe3d33\") " pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r4gs\" (UniqueName: \"kubernetes.io/projected/5c09333e-da20-4f48-96b9-29021e93149b-kube-api-access-2r4gs\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.331338 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h69g\" (UniqueName: \"kubernetes.io/projected/297946dc-5d6d-4389-bff3-3044865254ef-kube-api-access-4h69g\") pod \"horizon-operator-controller-manager-7777cf768b-bm84t\" (UID: \"297946dc-5d6d-4389-bff3-3044865254ef\") " pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.335974 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.348673 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6jz\" (UniqueName: \"kubernetes.io/projected/ea63dc95-4a48-4ed4-b990-c6990bbe3d33-kube-api-access-kl6jz\") pod \"heat-operator-controller-manager-5486f4b54f-xdn4k\" (UID: \"ea63dc95-4a48-4ed4-b990-c6990bbe3d33\") " pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.364479 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.372235 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.373291 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpshg\" (UniqueName: \"kubernetes.io/projected/7d3873c8-7bab-42f8-918a-344d87eacce9-kube-api-access-zpshg\") pod \"glance-operator-controller-manager-97b97479c-jw8m6\" (UID: \"7d3873c8-7bab-42f8-918a-344d87eacce9\") " pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.387087 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.398367 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.402096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h69g\" (UniqueName: \"kubernetes.io/projected/297946dc-5d6d-4389-bff3-3044865254ef-kube-api-access-4h69g\") pod \"horizon-operator-controller-manager-7777cf768b-bm84t\" (UID: \"297946dc-5d6d-4389-bff3-3044865254ef\") " pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.402385 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.412971 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jc9q7" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.423808 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.424733 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.428803 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tfwcw" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.434450 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxz77\" (UniqueName: \"kubernetes.io/projected/43453734-49dd-48b0-86b4-46b20966f2f5-kube-api-access-pxz77\") pod \"keystone-operator-controller-manager-5ccbd96f89-hrh2h\" (UID: \"43453734-49dd-48b0-86b4-46b20966f2f5\") " pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.434477 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpk75\" (UniqueName: \"kubernetes.io/projected/1295c691-04d0-4e6e-a4e6-4f85c6715964-kube-api-access-rpk75\") pod \"manila-operator-controller-manager-75b8755b74-q5plz\" (UID: \"1295c691-04d0-4e6e-a4e6-4f85c6715964\") " pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.434523 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r4gs\" (UniqueName: \"kubernetes.io/projected/5c09333e-da20-4f48-96b9-29021e93149b-kube-api-access-2r4gs\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.434570 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2gk\" (UniqueName: \"kubernetes.io/projected/4c379ff5-1113-4698-a898-9c1cb29000cf-kube-api-access-mx2gk\") pod \"ironic-operator-controller-manager-68f4bbb747-nfmz2\" (UID: \"4c379ff5-1113-4698-a898-9c1cb29000cf\") " pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.434592 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: E0613 05:02:21.434749 4894 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jun 13 05:02:21 crc kubenswrapper[4894]: E0613 05:02:21.434804 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert podName:5c09333e-da20-4f48-96b9-29021e93149b nodeName:}" failed. No retries permitted until 2025-06-13 05:02:21.934789124 +0000 UTC m=+700.381036587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert") pod "infra-operator-controller-manager-5b4ccb8c4-2mcf5" (UID: "5c09333e-da20-4f48-96b9-29021e93149b") : secret "infra-operator-webhook-server-cert" not found Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.451258 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.466536 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.471069 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.479945 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxz77\" (UniqueName: \"kubernetes.io/projected/43453734-49dd-48b0-86b4-46b20966f2f5-kube-api-access-pxz77\") pod \"keystone-operator-controller-manager-5ccbd96f89-hrh2h\" (UID: \"43453734-49dd-48b0-86b4-46b20966f2f5\") " pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.479962 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2gk\" (UniqueName: \"kubernetes.io/projected/4c379ff5-1113-4698-a898-9c1cb29000cf-kube-api-access-mx2gk\") pod \"ironic-operator-controller-manager-68f4bbb747-nfmz2\" (UID: \"4c379ff5-1113-4698-a898-9c1cb29000cf\") " pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.493980 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.495274 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpk75\" (UniqueName: \"kubernetes.io/projected/1295c691-04d0-4e6e-a4e6-4f85c6715964-kube-api-access-rpk75\") pod \"manila-operator-controller-manager-75b8755b74-q5plz\" (UID: \"1295c691-04d0-4e6e-a4e6-4f85c6715964\") " pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.506345 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r4gs\" (UniqueName: \"kubernetes.io/projected/5c09333e-da20-4f48-96b9-29021e93149b-kube-api-access-2r4gs\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.510731 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.511744 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.516876 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8cbhb" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.522811 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.523342 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.527640 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.534624 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.535627 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtstb\" (UniqueName: \"kubernetes.io/projected/6f399f41-0f28-471b-be85-3468ff990e9d-kube-api-access-jtstb\") pod \"mariadb-operator-controller-manager-7d4bbc7f54-r57lj\" (UID: \"6f399f41-0f28-471b-be85-3468ff990e9d\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.535647 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsrb\" (UniqueName: \"kubernetes.io/projected/c185ce61-38da-4eec-ab4d-4e73fbd9a957-kube-api-access-wdsrb\") pod \"neutron-operator-controller-manager-5df6744645-ll2wl\" (UID: \"c185ce61-38da-4eec-ab4d-4e73fbd9a957\") " pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.538435 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-v7m5h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.543346 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.580712 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.581014 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.588715 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.589718 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.593920 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.593937 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wvt6s" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.621717 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.630739 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.696582 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-58f798889d-2n26t"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.638044 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtstb\" (UniqueName: \"kubernetes.io/projected/6f399f41-0f28-471b-be85-3468ff990e9d-kube-api-access-jtstb\") pod \"mariadb-operator-controller-manager-7d4bbc7f54-r57lj\" (UID: \"6f399f41-0f28-471b-be85-3468ff990e9d\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.697890 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsrb\" (UniqueName: \"kubernetes.io/projected/c185ce61-38da-4eec-ab4d-4e73fbd9a957-kube-api-access-wdsrb\") pod \"neutron-operator-controller-manager-5df6744645-ll2wl\" (UID: \"c185ce61-38da-4eec-ab4d-4e73fbd9a957\") " pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.698057 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krk7r\" (UniqueName: \"kubernetes.io/projected/99063b46-9295-41e6-8ad6-5e6cefce2931-kube-api-access-krk7r\") pod \"nova-operator-controller-manager-664db87fd8-m64zp\" (UID: \"99063b46-9295-41e6-8ad6-5e6cefce2931\") " pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.698149 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqsv\" (UniqueName: \"kubernetes.io/projected/4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0-kube-api-access-6nqsv\") pod \"octavia-operator-controller-manager-857f9d6b88-pt7m6\" (UID: \"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0\") " pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.700716 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.703480 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.710371 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bztnq" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.714065 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.716488 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gww2b" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.721901 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtstb\" (UniqueName: \"kubernetes.io/projected/6f399f41-0f28-471b-be85-3468ff990e9d-kube-api-access-jtstb\") pod \"mariadb-operator-controller-manager-7d4bbc7f54-r57lj\" (UID: \"6f399f41-0f28-471b-be85-3468ff990e9d\") " pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.727754 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsrb\" (UniqueName: \"kubernetes.io/projected/c185ce61-38da-4eec-ab4d-4e73fbd9a957-kube-api-access-wdsrb\") pod \"neutron-operator-controller-manager-5df6744645-ll2wl\" (UID: \"c185ce61-38da-4eec-ab4d-4e73fbd9a957\") " pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.753177 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.760358 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.772177 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.776873 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.807625 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809032 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809086 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swwhd\" (UniqueName: \"kubernetes.io/projected/292d16cc-5623-4aa8-a644-2e69a901ca6f-kube-api-access-swwhd\") pod \"placement-operator-controller-manager-58f798889d-2n26t\" (UID: \"292d16cc-5623-4aa8-a644-2e69a901ca6f\") " pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809141 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpkd9\" (UniqueName: \"kubernetes.io/projected/6d32b2ff-59b7-4326-94e9-69e0fbd6ce34-kube-api-access-tpkd9\") pod \"ovn-operator-controller-manager-9f78645d5-s9r55\" (UID: \"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34\") " pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809178 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnrb\" (UniqueName: \"kubernetes.io/projected/bad848ff-73e6-4dad-a141-feac145e5c38-kube-api-access-vcnrb\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809207 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krk7r\" (UniqueName: \"kubernetes.io/projected/99063b46-9295-41e6-8ad6-5e6cefce2931-kube-api-access-krk7r\") pod \"nova-operator-controller-manager-664db87fd8-m64zp\" (UID: \"99063b46-9295-41e6-8ad6-5e6cefce2931\") " pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.809241 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqsv\" (UniqueName: \"kubernetes.io/projected/4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0-kube-api-access-6nqsv\") pod \"octavia-operator-controller-manager-857f9d6b88-pt7m6\" (UID: \"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0\") " pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.822233 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jhjd9" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.824854 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-58f798889d-2n26t"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.828642 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.835341 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.838680 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.844354 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.848302 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.853789 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9978g" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.859877 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqsv\" (UniqueName: \"kubernetes.io/projected/4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0-kube-api-access-6nqsv\") pod \"octavia-operator-controller-manager-857f9d6b88-pt7m6\" (UID: \"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0\") " pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.862871 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krk7r\" (UniqueName: \"kubernetes.io/projected/99063b46-9295-41e6-8ad6-5e6cefce2931-kube-api-access-krk7r\") pod \"nova-operator-controller-manager-664db87fd8-m64zp\" (UID: \"99063b46-9295-41e6-8ad6-5e6cefce2931\") " pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.871838 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.872971 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.877220 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-dkkmz" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.877623 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.888286 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc"] Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.919812 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jql6x\" (UniqueName: \"kubernetes.io/projected/29f79bd1-5c08-4435-8024-0a136c6b9337-kube-api-access-jql6x\") pod \"test-operator-controller-manager-6db7bffb67-rnhvc\" (UID: \"29f79bd1-5c08-4435-8024-0a136c6b9337\") " pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.919960 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swwhd\" (UniqueName: \"kubernetes.io/projected/292d16cc-5623-4aa8-a644-2e69a901ca6f-kube-api-access-swwhd\") pod \"placement-operator-controller-manager-58f798889d-2n26t\" (UID: \"292d16cc-5623-4aa8-a644-2e69a901ca6f\") " pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.920047 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpkd9\" (UniqueName: \"kubernetes.io/projected/6d32b2ff-59b7-4326-94e9-69e0fbd6ce34-kube-api-access-tpkd9\") pod \"ovn-operator-controller-manager-9f78645d5-s9r55\" (UID: \"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34\") " pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.920127 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnrb\" (UniqueName: \"kubernetes.io/projected/bad848ff-73e6-4dad-a141-feac145e5c38-kube-api-access-vcnrb\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.920226 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zstj\" (UniqueName: \"kubernetes.io/projected/898d7bc9-6d9c-4e81-b72e-fdb6f7440b43-kube-api-access-5zstj\") pod \"swift-operator-controller-manager-7779c57cf7-7zldr\" (UID: \"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43\") " pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.920302 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mmwp\" (UniqueName: \"kubernetes.io/projected/de71738a-f07f-49c4-9820-1480db37be05-kube-api-access-8mmwp\") pod \"telemetry-operator-controller-manager-884d667-sk2l9\" (UID: \"de71738a-f07f-49c4-9820-1480db37be05\") " pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.920371 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:21 crc kubenswrapper[4894]: E0613 05:02:21.920514 4894 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jun 13 05:02:21 crc kubenswrapper[4894]: E0613 05:02:21.920604 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert podName:bad848ff-73e6-4dad-a141-feac145e5c38 nodeName:}" failed. No retries permitted until 2025-06-13 05:02:22.420591375 +0000 UTC m=+700.866838838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert") pod "openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" (UID: "bad848ff-73e6-4dad-a141-feac145e5c38") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.923010 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.960802 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpkd9\" (UniqueName: \"kubernetes.io/projected/6d32b2ff-59b7-4326-94e9-69e0fbd6ce34-kube-api-access-tpkd9\") pod \"ovn-operator-controller-manager-9f78645d5-s9r55\" (UID: \"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34\") " pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.961703 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swwhd\" (UniqueName: \"kubernetes.io/projected/292d16cc-5623-4aa8-a644-2e69a901ca6f-kube-api-access-swwhd\") pod \"placement-operator-controller-manager-58f798889d-2n26t\" (UID: \"292d16cc-5623-4aa8-a644-2e69a901ca6f\") " pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:02:21 crc kubenswrapper[4894]: I0613 05:02:21.962340 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnrb\" (UniqueName: \"kubernetes.io/projected/bad848ff-73e6-4dad-a141-feac145e5c38-kube-api-access-vcnrb\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.000734 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.001896 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.004852 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9wl6p" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.005163 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.011234 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.023678 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030178 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030218 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zstj\" (UniqueName: \"kubernetes.io/projected/898d7bc9-6d9c-4e81-b72e-fdb6f7440b43-kube-api-access-5zstj\") pod \"swift-operator-controller-manager-7779c57cf7-7zldr\" (UID: \"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43\") " pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030241 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mmwp\" (UniqueName: \"kubernetes.io/projected/de71738a-f07f-49c4-9820-1480db37be05-kube-api-access-8mmwp\") pod \"telemetry-operator-controller-manager-884d667-sk2l9\" (UID: \"de71738a-f07f-49c4-9820-1480db37be05\") " pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030265 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7ll\" (UniqueName: \"kubernetes.io/projected/2b3aee0a-6aa2-494d-8eae-5f97d5954868-kube-api-access-wc7ll\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030298 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jql6x\" (UniqueName: \"kubernetes.io/projected/29f79bd1-5c08-4435-8024-0a136c6b9337-kube-api-access-jql6x\") pod \"test-operator-controller-manager-6db7bffb67-rnhvc\" (UID: \"29f79bd1-5c08-4435-8024-0a136c6b9337\") " pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.030325 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " 
pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.033856 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.035395 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.047199 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.047237 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.047791 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8v57s" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.060557 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c09333e-da20-4f48-96b9-29021e93149b-cert\") pod \"infra-operator-controller-manager-5b4ccb8c4-2mcf5\" (UID: \"5c09333e-da20-4f48-96b9-29021e93149b\") " pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.064988 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mmwp\" (UniqueName: \"kubernetes.io/projected/de71738a-f07f-49c4-9820-1480db37be05-kube-api-access-8mmwp\") pod \"telemetry-operator-controller-manager-884d667-sk2l9\" (UID: \"de71738a-f07f-49c4-9820-1480db37be05\") " pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.065284 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jql6x\" (UniqueName: \"kubernetes.io/projected/29f79bd1-5c08-4435-8024-0a136c6b9337-kube-api-access-jql6x\") pod \"test-operator-controller-manager-6db7bffb67-rnhvc\" (UID: \"29f79bd1-5c08-4435-8024-0a136c6b9337\") " pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.072569 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zstj\" (UniqueName: \"kubernetes.io/projected/898d7bc9-6d9c-4e81-b72e-fdb6f7440b43-kube-api-access-5zstj\") pod \"swift-operator-controller-manager-7779c57cf7-7zldr\" (UID: \"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43\") " pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.078181 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.078793 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:22 crc kubenswrapper[4894]: E0613 05:02:22.133794 4894 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jun 13 05:02:22 crc kubenswrapper[4894]: E0613 05:02:22.134083 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert podName:2b3aee0a-6aa2-494d-8eae-5f97d5954868 nodeName:}" failed. No retries permitted until 2025-06-13 05:02:22.634060176 +0000 UTC m=+701.080307639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert") pod "openstack-operator-controller-manager-74d9b8b9f5-cj7hp" (UID: "2b3aee0a-6aa2-494d-8eae-5f97d5954868") : secret "webhook-server-cert" not found Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.134964 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" event={"ID":"784a682d-1749-4399-a1f4-1e8bee7968ce","Type":"ContainerStarted","Data":"cf77d59d0f902736f876d0badd4e404756769d594e5b3c97094c37390d01797b"} Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.131653 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.145161 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7ll\" (UniqueName: \"kubernetes.io/projected/2b3aee0a-6aa2-494d-8eae-5f97d5954868-kube-api-access-wc7ll\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.145231 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8c5\" (UniqueName: \"kubernetes.io/projected/4f86499e-2447-4489-89d5-1777e4d445c6-kube-api-access-fl8c5\") pod \"rabbitmq-cluster-operator-manager-67ff8584d-fzgb7\" (UID: \"4f86499e-2447-4489-89d5-1777e4d445c6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.151532 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.192651 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7ll\" (UniqueName: \"kubernetes.io/projected/2b3aee0a-6aa2-494d-8eae-5f97d5954868-kube-api-access-wc7ll\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.196113 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.227981 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.248986 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8c5\" (UniqueName: \"kubernetes.io/projected/4f86499e-2447-4489-89d5-1777e4d445c6-kube-api-access-fl8c5\") pod \"rabbitmq-cluster-operator-manager-67ff8584d-fzgb7\" (UID: \"4f86499e-2447-4489-89d5-1777e4d445c6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.249456 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.270289 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b554678df-6trss"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.297410 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8c5\" (UniqueName: \"kubernetes.io/projected/4f86499e-2447-4489-89d5-1777e4d445c6-kube-api-access-fl8c5\") pod \"rabbitmq-cluster-operator-manager-67ff8584d-fzgb7\" (UID: \"4f86499e-2447-4489-89d5-1777e4d445c6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.431435 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.478700 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.480131 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:22 crc kubenswrapper[4894]: E0613 05:02:22.481359 4894 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jun 13 05:02:22 crc kubenswrapper[4894]: E0613 05:02:22.481601 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert podName:bad848ff-73e6-4dad-a141-feac145e5c38 nodeName:}" failed. No retries permitted until 2025-06-13 05:02:23.481587443 +0000 UTC m=+701.927834896 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert") pod "openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" (UID: "bad848ff-73e6-4dad-a141-feac145e5c38") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.550608 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.682315 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.701015 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b3aee0a-6aa2-494d-8eae-5f97d5954868-cert\") pod \"openstack-operator-controller-manager-74d9b8b9f5-cj7hp\" (UID: \"2b3aee0a-6aa2-494d-8eae-5f97d5954868\") " pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.752128 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.846196 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.852369 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2"] Jun 13 05:02:22 crc kubenswrapper[4894]: I0613 05:02:22.860699 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t"] Jun 13 05:02:22 crc kubenswrapper[4894]: W0613 05:02:22.863039 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c379ff5_1113_4698_a898_9c1cb29000cf.slice/crio-1479ee79f6678158fa1ec36ce8cd31c5a7258a5b4d6384f6b591c5fbd76cecd4 WatchSource:0}: Error finding container 1479ee79f6678158fa1ec36ce8cd31c5a7258a5b4d6384f6b591c5fbd76cecd4: Status 404 returned error can't find the container with id 1479ee79f6678158fa1ec36ce8cd31c5a7258a5b4d6384f6b591c5fbd76cecd4 Jun 13 05:02:22 crc kubenswrapper[4894]: W0613 05:02:22.869573 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43453734_49dd_48b0_86b4_46b20966f2f5.slice/crio-efd69888c608545978c2b409c96418fb91a6757065bff9740be96faaef99e75c WatchSource:0}: Error finding container efd69888c608545978c2b409c96418fb91a6757065bff9740be96faaef99e75c: Status 404 returned error can't find the container with id efd69888c608545978c2b409c96418fb91a6757065bff9740be96faaef99e75c Jun 13 05:02:22 crc kubenswrapper[4894]: W0613 05:02:22.882838 4894 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297946dc_5d6d_4389_bff3_3044865254ef.slice/crio-e343fc5625fc8cfcf71439c1cbc970050a78122fd2747834ba9e45b7fbf92ea3 WatchSource:0}: Error finding container e343fc5625fc8cfcf71439c1cbc970050a78122fd2747834ba9e45b7fbf92ea3: Status 404 returned error can't find the container with id e343fc5625fc8cfcf71439c1cbc970050a78122fd2747834ba9e45b7fbf92ea3 Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.063552 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.073753 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz"] Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.073865 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f399f41_0f28_471b_be85_3468ff990e9d.slice/crio-ce277d768d24e8ce3e6753c80ac7a2f3b6162e6ee80e4d9fa8595142e1587f37 WatchSource:0}: Error finding container ce277d768d24e8ce3e6753c80ac7a2f3b6162e6ee80e4d9fa8595142e1587f37: Status 404 returned error can't find the container with id ce277d768d24e8ce3e6753c80ac7a2f3b6162e6ee80e4d9fa8595142e1587f37 Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.077159 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbe8d8f_e512_4ff2_8128_f9cbe4e070a0.slice/crio-647093f7ba5d3bf6886d251f6310ea93dea8f075234abd41e228596886c29306 WatchSource:0}: Error finding container 647093f7ba5d3bf6886d251f6310ea93dea8f075234abd41e228596886c29306: Status 404 returned error can't find the container with id 647093f7ba5d3bf6886d251f6310ea93dea8f075234abd41e228596886c29306 Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.083119 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6"] Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.092532 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292d16cc_5623_4aa8_a644_2e69a901ca6f.slice/crio-b0508f6eb3170c0d544023a263a9bdc980b3a818a8016e324aa18ed1c1bb20f2 WatchSource:0}: Error finding container b0508f6eb3170c0d544023a263a9bdc980b3a818a8016e324aa18ed1c1bb20f2: Status 404 returned error can't find the container with id b0508f6eb3170c0d544023a263a9bdc980b3a818a8016e324aa18ed1c1bb20f2 Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.102441 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-58f798889d-2n26t"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.141836 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" event={"ID":"1295c691-04d0-4e6e-a4e6-4f85c6715964","Type":"ContainerStarted","Data":"557e4da1e7f4826d3c70ff2bd5c40b5683725fd5ef72eefce639b1f697097096"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.143168 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" event={"ID":"4c379ff5-1113-4698-a898-9c1cb29000cf","Type":"ContainerStarted","Data":"1479ee79f6678158fa1ec36ce8cd31c5a7258a5b4d6384f6b591c5fbd76cecd4"} Jun 13 05:02:23 crc 
kubenswrapper[4894]: I0613 05:02:23.145620 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" event={"ID":"250d2934-5f6e-4d4f-96d9-ec258c71909e","Type":"ContainerStarted","Data":"9807e58589c45c5e17322f55d507954e82242782cb99d126767fcee5048c9af0"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.146848 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" event={"ID":"43453734-49dd-48b0-86b4-46b20966f2f5","Type":"ContainerStarted","Data":"efd69888c608545978c2b409c96418fb91a6757065bff9740be96faaef99e75c"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.147901 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" event={"ID":"6f399f41-0f28-471b-be85-3468ff990e9d","Type":"ContainerStarted","Data":"ce277d768d24e8ce3e6753c80ac7a2f3b6162e6ee80e4d9fa8595142e1587f37"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.149514 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" event={"ID":"ea63dc95-4a48-4ed4-b990-c6990bbe3d33","Type":"ContainerStarted","Data":"e5aa03dbfd79db1b277fbb9aa0ab60ec6f88f4894144d20c3bc981a1096260ed"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.150838 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" event={"ID":"7d3873c8-7bab-42f8-918a-344d87eacce9","Type":"ContainerStarted","Data":"46a7e899bc71c91ba795b29e470b806780f59030e295aac58be2a8bef1c5a986"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.152804 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" event={"ID":"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0","Type":"ContainerStarted","Data":"647093f7ba5d3bf6886d251f6310ea93dea8f075234abd41e228596886c29306"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.153798 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" event={"ID":"297946dc-5d6d-4389-bff3-3044865254ef","Type":"ContainerStarted","Data":"e343fc5625fc8cfcf71439c1cbc970050a78122fd2747834ba9e45b7fbf92ea3"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.156336 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" event={"ID":"292d16cc-5623-4aa8-a644-2e69a901ca6f","Type":"ContainerStarted","Data":"b0508f6eb3170c0d544023a263a9bdc980b3a818a8016e324aa18ed1c1bb20f2"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.157901 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" event={"ID":"6e780a91-140a-4b7b-9748-c3a6c3b954e1","Type":"ContainerStarted","Data":"64b9fbd407cd47461ee1959c75f3f89bb176f8fcc48e6d1d24fc5a4990793062"} Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.487216 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.500900 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr"] Jun 13 05:02:23 crc kubenswrapper[4894]: 
I0613 05:02:23.504397 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.514547 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.520343 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad848ff-73e6-4dad-a141-feac145e5c38-cert\") pod \"openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt\" (UID: \"bad848ff-73e6-4dad-a141-feac145e5c38\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.549554 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.557134 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.574734 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.585904 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.593085 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9"] Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.596641 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp"] Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.608359 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f79bd1_5c08_4435_8024_0a136c6b9337.slice/crio-6397c6056c9e807ba15e22a45cc50fa8cdb7935caca666b60c0893e33036c368 WatchSource:0}: Error finding container 6397c6056c9e807ba15e22a45cc50fa8cdb7935caca666b60c0893e33036c368: Status 404 returned error can't find the container with id 6397c6056c9e807ba15e22a45cc50fa8cdb7935caca666b60c0893e33036c368 Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.617008 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f86499e_2447_4489_89d5_1777e4d445c6.slice/crio-55d6ab00c209937930149eb0c47ed6a41e0bebd65d64fff36ac497663e8de6d2 WatchSource:0}: Error finding container 55d6ab00c209937930149eb0c47ed6a41e0bebd65d64fff36ac497663e8de6d2: Status 404 returned error can't find the container with id 55d6ab00c209937930149eb0c47ed6a41e0bebd65d64fff36ac497663e8de6d2 Jun 13 05:02:23 crc kubenswrapper[4894]: E0613 05:02:23.624019 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fl8c5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-67ff8584d-fzgb7_openstack-operators(4f86499e-2447-4489-89d5-1777e4d445c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jun 13 05:02:23 crc kubenswrapper[4894]: W0613 05:02:23.624377 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3aee0a_6aa2_494d_8eae_5f97d5954868.slice/crio-f9e471df7d4b65167d7dff380f2e338b5090f1454e6f4a92b2ee7263c31c76f8 WatchSource:0}: Error finding container f9e471df7d4b65167d7dff380f2e338b5090f1454e6f4a92b2ee7263c31c76f8: Status 404 returned error can't find the container with id f9e471df7d4b65167d7dff380f2e338b5090f1454e6f4a92b2ee7263c31c76f8 Jun 13 05:02:23 crc kubenswrapper[4894]: E0613 05:02:23.625459 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" podUID="4f86499e-2447-4489-89d5-1777e4d445c6" Jun 13 05:02:23 crc kubenswrapper[4894]: E0613 05:02:23.649347 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:5d237421ae87d4a765a6ba8e4ab6e82e2fc082f2bf900174be343710e043ba2a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jql6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6db7bffb67-rnhvc_openstack-operators(29f79bd1-5c08-4435-8024-0a136c6b9337): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jun 13 05:02:23 crc kubenswrapper[4894]: E0613 05:02:23.649451 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:da08b3c2e379399cdab66cf8c5248ed8bf9783745cf4d68a6c9efb0c30838a0a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdsrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5df6744645-ll2wl_openstack-operators(c185ce61-38da-4eec-ab4d-4e73fbd9a957): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jun 13 05:02:23 crc kubenswrapper[4894]: I0613 05:02:23.800757 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.183081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" event={"ID":"5c09333e-da20-4f48-96b9-29021e93149b","Type":"ContainerStarted","Data":"f3b77fdb5f99a5125e61c5b103fe1e9fe69b732bee6cd5dad0b4956bee2ebb27"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.196483 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" event={"ID":"de71738a-f07f-49c4-9820-1480db37be05","Type":"ContainerStarted","Data":"a8fd6cfb51a3d9541632b394c7f788eabfab19f7a8098ae899fd10a8c35ee391"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.211062 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" event={"ID":"4f86499e-2447-4489-89d5-1777e4d445c6","Type":"ContainerStarted","Data":"55d6ab00c209937930149eb0c47ed6a41e0bebd65d64fff36ac497663e8de6d2"} Jun 13 05:02:24 crc kubenswrapper[4894]: E0613 05:02:24.227251 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" podUID="4f86499e-2447-4489-89d5-1777e4d445c6" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.241334 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" event={"ID":"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34","Type":"ContainerStarted","Data":"07095beea7cfd67aab189199db6a13138fa09dcea39f3089aac93ddd4f5d28ff"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.245402 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" event={"ID":"2b3aee0a-6aa2-494d-8eae-5f97d5954868","Type":"ContainerStarted","Data":"f9e471df7d4b65167d7dff380f2e338b5090f1454e6f4a92b2ee7263c31c76f8"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.248553 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" event={"ID":"c185ce61-38da-4eec-ab4d-4e73fbd9a957","Type":"ContainerStarted","Data":"a95d6dfddb7db85473d9302f5e1d776f88acd7e1ba9759bdfc1fe480e44d68cf"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.251152 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" event={"ID":"99063b46-9295-41e6-8ad6-5e6cefce2931","Type":"ContainerStarted","Data":"b4fd236c9609aa2c666440ee7f592edcb4e96e780c3162770b7fa3c1b63ce784"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.252075 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" event={"ID":"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43","Type":"ContainerStarted","Data":"7e68ee55732883a5c96345ebe0376715b872b52f0b0d28fdfb122993c66af8d9"} Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.253367 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" event={"ID":"29f79bd1-5c08-4435-8024-0a136c6b9337","Type":"ContainerStarted","Data":"6397c6056c9e807ba15e22a45cc50fa8cdb7935caca666b60c0893e33036c368"} Jun 13 05:02:24 crc kubenswrapper[4894]: E0613 05:02:24.261307 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" podUID="c185ce61-38da-4eec-ab4d-4e73fbd9a957" Jun 13 05:02:24 crc kubenswrapper[4894]: E0613 05:02:24.346617 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" podUID="29f79bd1-5c08-4435-8024-0a136c6b9337" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.350386 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.351648 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.357388 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.548897 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqxgq\" (UniqueName: \"kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.548938 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.548991 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.653451 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxgq\" (UniqueName: \"kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.653809 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.653862 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.654327 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.654361 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.685517 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hqxgq\" (UniqueName: \"kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq\") pod \"community-operators-l2j25\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.769958 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:24 crc kubenswrapper[4894]: I0613 05:02:24.774030 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt"] Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.308314 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" event={"ID":"29f79bd1-5c08-4435-8024-0a136c6b9337","Type":"ContainerStarted","Data":"0053b316013bc275ae3eb5d9958b7657b1b1bb9584d84e7dcfc8dc60dd6fd02f"} Jun 13 05:02:25 crc kubenswrapper[4894]: E0613 05:02:25.334453 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:5d237421ae87d4a765a6ba8e4ab6e82e2fc082f2bf900174be343710e043ba2a\\\"\"" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" podUID="29f79bd1-5c08-4435-8024-0a136c6b9337" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.378512 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" event={"ID":"2b3aee0a-6aa2-494d-8eae-5f97d5954868","Type":"ContainerStarted","Data":"474e2c4bde6cab17e1c0b8b1d6c8b4e821d7be027005e776d9f12c75c34377cd"} Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.378552 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" event={"ID":"2b3aee0a-6aa2-494d-8eae-5f97d5954868","Type":"ContainerStarted","Data":"35253aa106cc2f56be702e6f8dbc1d26d52c13e5a7942b6d8db2e76385addb64"} Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.379317 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.396313 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" event={"ID":"c185ce61-38da-4eec-ab4d-4e73fbd9a957","Type":"ContainerStarted","Data":"fdedbfe6c3f37d639c5a99ece2917245420d8c03152908a478dce689b8573c62"} Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.411080 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" podStartSLOduration=4.411060809 podStartE2EDuration="4.411060809s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:02:25.402130678 +0000 UTC m=+703.848378141" watchObservedRunningTime="2025-06-13 05:02:25.411060809 +0000 UTC m=+703.857308272" Jun 13 05:02:25 crc kubenswrapper[4894]: E0613 05:02:25.413816 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:da08b3c2e379399cdab66cf8c5248ed8bf9783745cf4d68a6c9efb0c30838a0a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" podUID="c185ce61-38da-4eec-ab4d-4e73fbd9a957" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.414044 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" event={"ID":"bad848ff-73e6-4dad-a141-feac145e5c38","Type":"ContainerStarted","Data":"3a26d59bf383a0caa59b620ac065b79f365f2652b2ed262ebcb5cb9337b65fce"} Jun 13 05:02:25 crc kubenswrapper[4894]: E0613 05:02:25.423502 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" podUID="4f86499e-2447-4489-89d5-1777e4d445c6" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.445878 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.445912 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.634390 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:02:25 crc kubenswrapper[4894]: I0613 05:02:25.976174 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.236210 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.236273 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.429910 4894 generic.go:334] "Generic (PLEG): container finished" podID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerID="17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471" exitCode=0 Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.430039 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerDied","Data":"17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471"} Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.430103 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerStarted","Data":"c84c8fc1d10bdca1608e2689982a91b546b83754faf63d83a064ac9b964e6dbc"} Jun 13 
05:02:26 crc kubenswrapper[4894]: E0613 05:02:26.447882 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:5d237421ae87d4a765a6ba8e4ab6e82e2fc082f2bf900174be343710e043ba2a\\\"\"" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" podUID="29f79bd1-5c08-4435-8024-0a136c6b9337" Jun 13 05:02:26 crc kubenswrapper[4894]: E0613 05:02:26.447883 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:da08b3c2e379399cdab66cf8c5248ed8bf9783745cf4d68a6c9efb0c30838a0a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" podUID="c185ce61-38da-4eec-ab4d-4e73fbd9a957" Jun 13 05:02:26 crc kubenswrapper[4894]: I0613 05:02:26.641887 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:28 crc kubenswrapper[4894]: I0613 05:02:28.290876 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:28 crc kubenswrapper[4894]: I0613 05:02:28.482106 4894 generic.go:334] "Generic (PLEG): container finished" podID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerID="d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5" exitCode=0 Jun 13 05:02:28 crc kubenswrapper[4894]: I0613 05:02:28.482221 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerDied","Data":"d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5"} Jun 13 05:02:28 crc kubenswrapper[4894]: I0613 05:02:28.482399 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxld2" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="registry-server" containerID="cri-o://7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" gracePeriod=2 Jun 13 05:02:31 crc kubenswrapper[4894]: I0613 05:02:31.510293 4894 generic.go:334] "Generic (PLEG): container finished" podID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerID="7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" exitCode=0 Jun 13 05:02:31 crc kubenswrapper[4894]: I0613 05:02:31.510395 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerDied","Data":"7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc"} Jun 13 05:02:32 crc kubenswrapper[4894]: I0613 05:02:32.758273 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-74d9b8b9f5-cj7hp" Jun 13 05:02:35 crc kubenswrapper[4894]: E0613 05:02:35.447619 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc is running failed: container process not found" containerID="7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" cmd=["grpc_health_probe","-addr=:50051"] Jun 13 05:02:35 crc kubenswrapper[4894]: E0613 05:02:35.448876 
4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc is running failed: container process not found" containerID="7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" cmd=["grpc_health_probe","-addr=:50051"] Jun 13 05:02:35 crc kubenswrapper[4894]: E0613 05:02:35.449423 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc is running failed: container process not found" containerID="7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" cmd=["grpc_health_probe","-addr=:50051"] Jun 13 05:02:35 crc kubenswrapper[4894]: E0613 05:02:35.449481 4894 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-pxld2" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="registry-server" Jun 13 05:02:42 crc kubenswrapper[4894]: E0613 05:02:42.648588 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:47d1eacd07738b8dc59814467f756e12092d57c051b119be499f425ec738d607" Jun 13 05:02:42 crc kubenswrapper[4894]: E0613 05:02:42.649322 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:47d1eacd07738b8dc59814467f756e12092d57c051b119be499f425ec738d607,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mmwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-884d667-sk2l9_openstack-operators(de71738a-f07f-49c4-9820-1480db37be05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.107593 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:0e9cc04d1421ac129bbae99f4e8089a41cd6dcb768f3fbc14264a1b5968d8b60" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.107915 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:0e9cc04d1421ac129bbae99f4e8089a41cd6dcb768f3fbc14264a1b5968d8b60,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swwhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-58f798889d-2n26t_openstack-operators(292d16cc-5623-4aa8-a644-2e69a901ca6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.444926 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:852ad251fd02b5ffd89018239d0435d4e0ac8f7f372245d74f32ac5931bf6958" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.445295 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:852ad251fd02b5ffd89018239d0435d4e0ac8f7f372245d74f32ac5931bf6958,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nqsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-857f9d6b88-pt7m6_openstack-operators(4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.796437 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:fe55d3eff25aee0e0a215c41b84431ce206544d0d0071df708c94db32d06dbf5" Jun 13 05:02:43 crc kubenswrapper[4894]: E0613 05:02:43.796641 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:fe55d3eff25aee0e0a215c41b84431ce206544d0d0071df708c94db32d06dbf5,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d6jql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-b554678df-6trss_openstack-operators(6e780a91-140a-4b7b-9748-c3a6c3b954e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.244938 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:0dd27821f0cf9de77f407662d64c69dc9b0f22e944bfbf03cb4f8805dc9e21c9" Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.245104 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:0dd27821f0cf9de77f407662d64c69dc9b0f22e944bfbf03cb4f8805dc9e21c9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{1073741824 0} {} 1Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2r4gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-5b4ccb8c4-2mcf5_openstack-operators(5c09333e-da20-4f48-96b9-29021e93149b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.273170 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.322333 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities\") pod \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.322442 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crd8l\" (UniqueName: \"kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l\") pod \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.322504 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content\") pod \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\" (UID: \"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09\") " Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.323113 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities" (OuterVolumeSpecName: "utilities") pod "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" (UID: "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.330318 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l" (OuterVolumeSpecName: "kube-api-access-crd8l") pod "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" (UID: "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09"). InnerVolumeSpecName "kube-api-access-crd8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.330525 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" (UID: "930f2c7c-e3fb-4f72-8c7e-43853b1d2a09"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.423965 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.423997 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crd8l\" (UniqueName: \"kubernetes.io/projected/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-kube-api-access-crd8l\") on node \"crc\" DevicePath \"\"" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.424006 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.612626 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxld2" event={"ID":"930f2c7c-e3fb-4f72-8c7e-43853b1d2a09","Type":"ContainerDied","Data":"4e4a7e7216552f95b05d51bb4cfcd2ef634b82c63203ab689bff05d4970b7d5b"} Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.612693 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxld2" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.612699 4894 scope.go:117] "RemoveContainer" containerID="7545253a43d6a36783c044261bb56b2eaac9b88c94eec1406290782482d0afbc" Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.644521 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:44 crc kubenswrapper[4894]: I0613 05:02:44.648587 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxld2"] Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.720989 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:e6ff2e9067469a0ab6f3c88d670329e7d48d6e7b72a78d7c831bf25517adcacd" Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.721136 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:e6ff2e9067469a0ab6f3c88d670329e7d48d6e7b72a78d7c831bf25517adcacd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5zstj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7779c57cf7-7zldr_openstack-operators(898d7bc9-6d9c-4e81-b72e-fdb6f7440b43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.996761 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:376fa11edc40f68251027f0f30f9e0db312cc88fd9386985b26e357f6e487515" Jun 13 05:02:44 crc kubenswrapper[4894]: E0613 05:02:44.997127 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:376fa11edc40f68251027f0f30f9e0db312cc88fd9386985b26e357f6e487515,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h69g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-7777cf768b-bm84t_openstack-operators(297946dc-5d6d-4389-bff3-3044865254ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:45 crc kubenswrapper[4894]: E0613 05:02:45.381963 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:39b8745c08b17b80eb7b9418f4031e5231da803282dd4b63dfe13e26185c65da" Jun 13 05:02:45 crc kubenswrapper[4894]: E0613 05:02:45.383079 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:39b8745c08b17b80eb7b9418f4031e5231da803282dd4b63dfe13e26185c65da,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kl6jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5486f4b54f-xdn4k_openstack-operators(ea63dc95-4a48-4ed4-b990-c6990bbe3d33): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:45 crc kubenswrapper[4894]: E0613 05:02:45.726532 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:5b5b3f3998b077e7a0fc2e4fbbc304142a56d765c1989f2afbd435e2697af8f7" Jun 13 05:02:45 crc kubenswrapper[4894]: E0613 05:02:45.726969 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:5b5b3f3998b077e7a0fc2e4fbbc304142a56d765c1989f2afbd435e2697af8f7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gxk6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-9889b4756-lsslv_openstack-operators(784a682d-1749-4399-a1f4-1e8bee7968ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:46 crc kubenswrapper[4894]: E0613 05:02:46.030940 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:3758effecdae62091b67afe9ab364ea1e37018baadac101cd60f6529a7464139" Jun 13 05:02:46 crc kubenswrapper[4894]: E0613 05:02:46.031110 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:3758effecdae62091b67afe9ab364ea1e37018baadac101cd60f6529a7464139,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{268435456 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pxz77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5ccbd96f89-hrh2h_openstack-operators(43453734-49dd-48b0-86b4-46b20966f2f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:02:46 crc kubenswrapper[4894]: I0613 05:02:46.287543 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" path="/var/lib/kubelet/pods/930f2c7c-e3fb-4f72-8c7e-43853b1d2a09/volumes" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.517137 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:02:47 crc kubenswrapper[4894]: E0613 05:02:47.518730 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="extract-utilities" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.518744 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="extract-utilities" Jun 13 05:02:47 crc kubenswrapper[4894]: E0613 05:02:47.518778 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="registry-server" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.518784 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="registry-server" Jun 13 05:02:47 crc kubenswrapper[4894]: E0613 05:02:47.518795 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="extract-content" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.518801 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="extract-content" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.519027 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="930f2c7c-e3fb-4f72-8c7e-43853b1d2a09" containerName="registry-server" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.520430 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.538236 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.566331 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.566436 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbdm\" (UniqueName: \"kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.566564 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.668276 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.668581 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbdm\" (UniqueName: \"kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.668608 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.668757 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.668962 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.684737 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qdbdm\" (UniqueName: \"kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm\") pod \"redhat-operators-bl55k\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:47 crc kubenswrapper[4894]: I0613 05:02:47.855830 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:48 crc kubenswrapper[4894]: I0613 05:02:48.361094 4894 scope.go:117] "RemoveContainer" containerID="3eb157c277002d16cf91a0df424f82c71b76f02b0f68f2efe9c4b30452dc3178" Jun 13 05:02:48 crc kubenswrapper[4894]: I0613 05:02:48.519291 4894 scope.go:117] "RemoveContainer" containerID="a0db7f77e7b984c3186c196de6eb2026ef78c07dae7159f1922615e82f9a6a76" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.780994 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" podUID="6e780a91-140a-4b7b-9748-c3a6c3b954e1" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.783043 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" podUID="de71738a-f07f-49c4-9820-1480db37be05" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.802132 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" podUID="784a682d-1749-4399-a1f4-1e8bee7968ce" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.807083 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" podUID="4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.810721 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" podUID="5c09333e-da20-4f48-96b9-29021e93149b" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.835092 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" podUID="297946dc-5d6d-4389-bff3-3044865254ef" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.850321 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" podUID="898d7bc9-6d9c-4e81-b72e-fdb6f7440b43" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 
05:02:48.878021 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" podUID="292d16cc-5623-4aa8-a644-2e69a901ca6f" Jun 13 05:02:48 crc kubenswrapper[4894]: E0613 05:02:48.897087 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" podUID="43453734-49dd-48b0-86b4-46b20966f2f5" Jun 13 05:02:48 crc kubenswrapper[4894]: I0613 05:02:48.978143 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:02:48 crc kubenswrapper[4894]: W0613 05:02:48.990831 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ce5fae_aa1c_478a_ab81_8043e7d3c3c3.slice/crio-d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75 WatchSource:0}: Error finding container d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75: Status 404 returned error can't find the container with id d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75 Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.238963 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" podUID="ea63dc95-4a48-4ed4-b990-c6990bbe3d33" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.650229 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" event={"ID":"250d2934-5f6e-4d4f-96d9-ec258c71909e","Type":"ContainerStarted","Data":"f29d5d23c6b9001e744a2e4adadd667f7f51760c0daa74e16968febe86cec6a4"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.651202 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" event={"ID":"784a682d-1749-4399-a1f4-1e8bee7968ce","Type":"ContainerStarted","Data":"0eff10e4e34fa19b0ff93ceb32278a1ad0b9635ba4856b66414c342d9a5c5d6f"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.652527 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:5b5b3f3998b077e7a0fc2e4fbbc304142a56d765c1989f2afbd435e2697af8f7\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" podUID="784a682d-1749-4399-a1f4-1e8bee7968ce" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.653766 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" event={"ID":"5c09333e-da20-4f48-96b9-29021e93149b","Type":"ContainerStarted","Data":"d91b2cde91b31b96941a77ec1cec2d66a420c2c2cb0d9cb1329c3b2a771798a6"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.654531 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:0dd27821f0cf9de77f407662d64c69dc9b0f22e944bfbf03cb4f8805dc9e21c9\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" podUID="5c09333e-da20-4f48-96b9-29021e93149b" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.655458 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" event={"ID":"de71738a-f07f-49c4-9820-1480db37be05","Type":"ContainerStarted","Data":"0fc73f0812522aa52635fce031cc80881b0102524a7f24db1d0f2c26dc664a8e"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.656489 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:47d1eacd07738b8dc59814467f756e12092d57c051b119be499f425ec738d607\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" podUID="de71738a-f07f-49c4-9820-1480db37be05" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.657424 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerStarted","Data":"d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.659327 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" event={"ID":"292d16cc-5623-4aa8-a644-2e69a901ca6f","Type":"ContainerStarted","Data":"7705efa3aeb4127f93ded18c97ab2b940e33cabe7009d7d54679113975d1f545"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.660815 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:0e9cc04d1421ac129bbae99f4e8089a41cd6dcb768f3fbc14264a1b5968d8b60\\\"\"" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" podUID="292d16cc-5623-4aa8-a644-2e69a901ca6f" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.661353 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" event={"ID":"43453734-49dd-48b0-86b4-46b20966f2f5","Type":"ContainerStarted","Data":"6569d41f4104ae4ea026d3c4e9026f4fe635d2b1b04079623d7e553aad99e110"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.662468 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" event={"ID":"4c379ff5-1113-4698-a898-9c1cb29000cf","Type":"ContainerStarted","Data":"2fe87bedb2d282eb2dd6da083f97e6a0ad9fda51bdcb6660f06c9cc2116195f6"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.662493 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3758effecdae62091b67afe9ab364ea1e37018baadac101cd60f6529a7464139\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" podUID="43453734-49dd-48b0-86b4-46b20966f2f5" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.664422 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerStarted","Data":"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.665921 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" event={"ID":"6e780a91-140a-4b7b-9748-c3a6c3b954e1","Type":"ContainerStarted","Data":"0e63b6f525e8c372d74b57a6a092665422cdcda65a7110710e0761954c269247"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.667378 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" event={"ID":"297946dc-5d6d-4389-bff3-3044865254ef","Type":"ContainerStarted","Data":"c4214bf2240e8a2a99e69a6308dcc7073859bf5136afa11ac3aaac6a36855bbf"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.667487 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:fe55d3eff25aee0e0a215c41b84431ce206544d0d0071df708c94db32d06dbf5\\\"\"" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" podUID="6e780a91-140a-4b7b-9748-c3a6c3b954e1" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.668480 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" event={"ID":"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43","Type":"ContainerStarted","Data":"f1f31f07ce4f1e7e6fabb0e2f560ed5c189df378bd3271669fddc6c6da9894ea"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.669148 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:376fa11edc40f68251027f0f30f9e0db312cc88fd9386985b26e357f6e487515\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" podUID="297946dc-5d6d-4389-bff3-3044865254ef" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.670129 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" event={"ID":"bad848ff-73e6-4dad-a141-feac145e5c38","Type":"ContainerStarted","Data":"a7ab0b8e2a6dd1c58cadd18573d9d6011c4e5e30c49757c54ee88f03e32dcce4"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.671494 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:e6ff2e9067469a0ab6f3c88d670329e7d48d6e7b72a78d7c831bf25517adcacd\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" podUID="898d7bc9-6d9c-4e81-b72e-fdb6f7440b43" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.672681 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" event={"ID":"29f79bd1-5c08-4435-8024-0a136c6b9337","Type":"ContainerStarted","Data":"1dc3be0b1546172ba12384fca9b10c97cc298bf70bf21a9ebe4ca8cf3f57d141"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.672861 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.674081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" event={"ID":"4f86499e-2447-4489-89d5-1777e4d445c6","Type":"ContainerStarted","Data":"1441f52a9efc52c2dcf9b951f7aed8d790279c9960392e2d28b37e93d6a9b8bb"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.675381 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" event={"ID":"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34","Type":"ContainerStarted","Data":"55081f2909d2d4900aba517715397d44bbfc6e57e069971a56d1fddaafd145d2"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.676830 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" event={"ID":"ea63dc95-4a48-4ed4-b990-c6990bbe3d33","Type":"ContainerStarted","Data":"f8b40f2b06e57adb178e2797f1de27b12ac06b5e2d16d9372f11f15244a09dd2"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.678208 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" event={"ID":"7d3873c8-7bab-42f8-918a-344d87eacce9","Type":"ContainerStarted","Data":"5d1c8b81a22526ed67af9f8062fd9a1426ce6b61b266ca1bc2636ca7cf5765c8"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.678534 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:39b8745c08b17b80eb7b9418f4031e5231da803282dd4b63dfe13e26185c65da\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" podUID="ea63dc95-4a48-4ed4-b990-c6990bbe3d33" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.679609 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" event={"ID":"1295c691-04d0-4e6e-a4e6-4f85c6715964","Type":"ContainerStarted","Data":"9d7aeb23567f7604ed3771dc37d1f0a7ad4bb77ebacc5e6642a766ce6f4024e5"} Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.684128 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" event={"ID":"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0","Type":"ContainerStarted","Data":"2c8f8d8d54463e8d3c19f9bfb13c8e54e524b313764769cd2d0d6e80313f7ecf"} Jun 13 05:02:49 crc kubenswrapper[4894]: E0613 05:02:49.685951 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:852ad251fd02b5ffd89018239d0435d4e0ac8f7f372245d74f32ac5931bf6958\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" podUID="4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0" Jun 13 05:02:49 crc kubenswrapper[4894]: I0613 05:02:49.859548 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-67ff8584d-fzgb7" podStartSLOduration=3.938406249 podStartE2EDuration="28.859535038s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.623884011 +0000 UTC m=+702.070131474" lastFinishedPulling="2025-06-13 
05:02:48.5450128 +0000 UTC m=+726.991260263" observedRunningTime="2025-06-13 05:02:49.857878701 +0000 UTC m=+728.304126164" watchObservedRunningTime="2025-06-13 05:02:49.859535038 +0000 UTC m=+728.305782501" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.004720 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" podStartSLOduration=4.286018708 podStartE2EDuration="29.004705366s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.649224185 +0000 UTC m=+702.095471638" lastFinishedPulling="2025-06-13 05:02:48.367910833 +0000 UTC m=+726.814158296" observedRunningTime="2025-06-13 05:02:49.975809922 +0000 UTC m=+728.422057385" watchObservedRunningTime="2025-06-13 05:02:50.004705366 +0000 UTC m=+728.450952829" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.007365 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2j25" podStartSLOduration=4.839196598 podStartE2EDuration="26.007358331s" podCreationTimestamp="2025-06-13 05:02:24 +0000 UTC" firstStartedPulling="2025-06-13 05:02:26.447842096 +0000 UTC m=+704.894089559" lastFinishedPulling="2025-06-13 05:02:47.616003829 +0000 UTC m=+726.062251292" observedRunningTime="2025-06-13 05:02:50.006042764 +0000 UTC m=+728.452290227" watchObservedRunningTime="2025-06-13 05:02:50.007358331 +0000 UTC m=+728.453605784" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.692196 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" event={"ID":"6d32b2ff-59b7-4326-94e9-69e0fbd6ce34","Type":"ContainerStarted","Data":"497e349d5b7198c6c117a389aaf6b94346b044aa3a72ff481d7d948d27826fa6"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.692551 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.694105 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" event={"ID":"6f399f41-0f28-471b-be85-3468ff990e9d","Type":"ContainerStarted","Data":"b66980d3303d861b93cfe535ce38d2c61f8859a74a301e2216a76802efae4eef"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.695670 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" event={"ID":"c185ce61-38da-4eec-ab4d-4e73fbd9a957","Type":"ContainerStarted","Data":"79d9fc1326e593426bbdc11d7c14e6ba2413476d43fbe94526ab7ec97b976e9f"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.696130 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.697265 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" event={"ID":"99063b46-9295-41e6-8ad6-5e6cefce2931","Type":"ContainerStarted","Data":"829f092cc12cf2ddecedc2de12a95766301315fa9404df320462db7727983443"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.698857 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" 
event={"ID":"1295c691-04d0-4e6e-a4e6-4f85c6715964","Type":"ContainerStarted","Data":"84f294797de7ab4b910161bd2617606549d80672968fabb02d3e9bc6a3dd75a2"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.699434 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.700515 4894 generic.go:334] "Generic (PLEG): container finished" podID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerID="1ae3dbbba472a587b388d616e639682714571fc2ef33644a6d169b67e096cdc4" exitCode=0 Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.700636 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerDied","Data":"1ae3dbbba472a587b388d616e639682714571fc2ef33644a6d169b67e096cdc4"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.702176 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" event={"ID":"bad848ff-73e6-4dad-a141-feac145e5c38","Type":"ContainerStarted","Data":"edf3dca6b7abdf5d974dd186e25d0d670a373b178ba67609812c2713b924b3b4"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.702557 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.708876 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" event={"ID":"4c379ff5-1113-4698-a898-9c1cb29000cf","Type":"ContainerStarted","Data":"8e9d16b690423f50721a71ea775a0e94c45c2792ecfe03939310a526c5098ae3"} Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.708901 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.709693 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:5b5b3f3998b077e7a0fc2e4fbbc304142a56d765c1989f2afbd435e2697af8f7\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" podUID="784a682d-1749-4399-a1f4-1e8bee7968ce" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.709896 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:0e9cc04d1421ac129bbae99f4e8089a41cd6dcb768f3fbc14264a1b5968d8b60\\\"\"" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" podUID="292d16cc-5623-4aa8-a644-2e69a901ca6f" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.709928 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:852ad251fd02b5ffd89018239d0435d4e0ac8f7f372245d74f32ac5931bf6958\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" podUID="4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 
05:02:50.709952 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:fe55d3eff25aee0e0a215c41b84431ce206544d0d0071df708c94db32d06dbf5\\\"\"" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" podUID="6e780a91-140a-4b7b-9748-c3a6c3b954e1" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.710349 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3758effecdae62091b67afe9ab364ea1e37018baadac101cd60f6529a7464139\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" podUID="43453734-49dd-48b0-86b4-46b20966f2f5" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.710422 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:0dd27821f0cf9de77f407662d64c69dc9b0f22e944bfbf03cb4f8805dc9e21c9\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" podUID="5c09333e-da20-4f48-96b9-29021e93149b" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.713769 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:376fa11edc40f68251027f0f30f9e0db312cc88fd9386985b26e357f6e487515\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" podUID="297946dc-5d6d-4389-bff3-3044865254ef" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.713828 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:39b8745c08b17b80eb7b9418f4031e5231da803282dd4b63dfe13e26185c65da\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" podUID="ea63dc95-4a48-4ed4-b990-c6990bbe3d33" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.713862 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:e6ff2e9067469a0ab6f3c88d670329e7d48d6e7b72a78d7c831bf25517adcacd\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" podUID="898d7bc9-6d9c-4e81-b72e-fdb6f7440b43" Jun 13 05:02:50 crc kubenswrapper[4894]: E0613 05:02:50.713890 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:47d1eacd07738b8dc59814467f756e12092d57c051b119be499f425ec738d607\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" podUID="de71738a-f07f-49c4-9820-1480db37be05" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.722001 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" podStartSLOduration=5.024840245 podStartE2EDuration="29.721991136s" 
podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.611930505 +0000 UTC m=+702.058177968" lastFinishedPulling="2025-06-13 05:02:48.309081396 +0000 UTC m=+726.755328859" observedRunningTime="2025-06-13 05:02:50.721131131 +0000 UTC m=+729.167378594" watchObservedRunningTime="2025-06-13 05:02:50.721991136 +0000 UTC m=+729.168238599" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.809286 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" podStartSLOduration=5.133780392 podStartE2EDuration="29.809257403s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.866325328 +0000 UTC m=+701.312572791" lastFinishedPulling="2025-06-13 05:02:47.541802339 +0000 UTC m=+725.988049802" observedRunningTime="2025-06-13 05:02:50.807279547 +0000 UTC m=+729.253527010" watchObservedRunningTime="2025-06-13 05:02:50.809257403 +0000 UTC m=+729.255504866" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.884858 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" podStartSLOduration=5.108227192 podStartE2EDuration="29.884843301s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.64939828 +0000 UTC m=+702.095645743" lastFinishedPulling="2025-06-13 05:02:48.426014389 +0000 UTC m=+726.872261852" observedRunningTime="2025-06-13 05:02:50.883481393 +0000 UTC m=+729.329728856" watchObservedRunningTime="2025-06-13 05:02:50.884843301 +0000 UTC m=+729.331090764" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.905260 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" podStartSLOduration=7.178595545 podStartE2EDuration="29.905247076s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:24.814732376 +0000 UTC m=+703.260979839" lastFinishedPulling="2025-06-13 05:02:47.541383907 +0000 UTC m=+725.987631370" observedRunningTime="2025-06-13 05:02:50.900559124 +0000 UTC m=+729.346806587" watchObservedRunningTime="2025-06-13 05:02:50.905247076 +0000 UTC m=+729.351494539" Jun 13 05:02:50 crc kubenswrapper[4894]: I0613 05:02:50.979131 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" podStartSLOduration=5.548810039 podStartE2EDuration="29.979116396s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.111492232 +0000 UTC m=+701.557739695" lastFinishedPulling="2025-06-13 05:02:47.541798589 +0000 UTC m=+725.988046052" observedRunningTime="2025-06-13 05:02:50.978939931 +0000 UTC m=+729.425187394" watchObservedRunningTime="2025-06-13 05:02:50.979116396 +0000 UTC m=+729.425363859" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.713867 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" event={"ID":"99063b46-9295-41e6-8ad6-5e6cefce2931","Type":"ContainerStarted","Data":"7aab6f8b3c8bf9e6ddd3f8d38a2073417af10d1f1f0f3dd90299fbe92a8a94b1"} Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.715016 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.719068 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerStarted","Data":"9b479e89ad21b47d6f3fa998d22c01556217fb87597ee4a3ee327a1446f0180c"} Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.722021 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" event={"ID":"250d2934-5f6e-4d4f-96d9-ec258c71909e","Type":"ContainerStarted","Data":"9544275e832d02a0675a8853a4dfb59c1310ef5fea5027835396d549a701a992"} Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.722107 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.725158 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" event={"ID":"6f399f41-0f28-471b-be85-3468ff990e9d","Type":"ContainerStarted","Data":"2731f6583265c7e53a2787012001fffae14141ec93690cd50aa24521d0159be3"} Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.725264 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.728976 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" event={"ID":"7d3873c8-7bab-42f8-918a-344d87eacce9","Type":"ContainerStarted","Data":"af73db3d2e7c50e6650905bb1f57bdb760217a74dd6c7b65da4e1d127ba36329"} Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.729834 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.739174 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" podStartSLOduration=6.048338037 podStartE2EDuration="30.73916463s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.611889374 +0000 UTC m=+702.058136837" lastFinishedPulling="2025-06-13 05:02:48.302715967 +0000 UTC m=+726.748963430" observedRunningTime="2025-06-13 05:02:51.736450184 +0000 UTC m=+730.182697647" watchObservedRunningTime="2025-06-13 05:02:51.73916463 +0000 UTC m=+730.185412093" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.761438 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" podStartSLOduration=6.093685063 podStartE2EDuration="30.761424117s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.496724149 +0000 UTC m=+700.942971602" lastFinishedPulling="2025-06-13 05:02:47.164463193 +0000 UTC m=+725.610710656" observedRunningTime="2025-06-13 05:02:51.757458325 +0000 UTC m=+730.203705788" watchObservedRunningTime="2025-06-13 05:02:51.761424117 +0000 UTC m=+730.207671580" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.799522 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" podStartSLOduration=6.271746288 podStartE2EDuration="30.799500999s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.076085885 +0000 UTC m=+701.522333348" lastFinishedPulling="2025-06-13 05:02:47.603840596 +0000 UTC m=+726.050088059" observedRunningTime="2025-06-13 05:02:51.794603471 +0000 UTC m=+730.240850934" watchObservedRunningTime="2025-06-13 05:02:51.799500999 +0000 UTC m=+730.245748462" Jun 13 05:02:51 crc kubenswrapper[4894]: I0613 05:02:51.815446 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" podStartSLOduration=5.465696948 podStartE2EDuration="30.815410757s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.192820531 +0000 UTC m=+700.639067994" lastFinishedPulling="2025-06-13 05:02:47.54253434 +0000 UTC m=+725.988781803" observedRunningTime="2025-06-13 05:02:51.811209759 +0000 UTC m=+730.257457222" watchObservedRunningTime="2025-06-13 05:02:51.815410757 +0000 UTC m=+730.261658220" Jun 13 05:02:52 crc kubenswrapper[4894]: I0613 05:02:52.766164 4894 generic.go:334] "Generic (PLEG): container finished" podID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerID="9b479e89ad21b47d6f3fa998d22c01556217fb87597ee4a3ee327a1446f0180c" exitCode=0 Jun 13 05:02:52 crc kubenswrapper[4894]: I0613 05:02:52.766330 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerDied","Data":"9b479e89ad21b47d6f3fa998d22c01556217fb87597ee4a3ee327a1446f0180c"} Jun 13 05:02:53 crc kubenswrapper[4894]: I0613 05:02:53.780064 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerStarted","Data":"a906cc013dd326881c943e6d5c8c8e55d9bc1073feb9d43c78c73a318f1f8f59"} Jun 13 05:02:53 crc kubenswrapper[4894]: I0613 05:02:53.817665 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bl55k" podStartSLOduration=4.309884261 podStartE2EDuration="6.817639151s" podCreationTimestamp="2025-06-13 05:02:47 +0000 UTC" firstStartedPulling="2025-06-13 05:02:50.703888966 +0000 UTC m=+729.150136429" lastFinishedPulling="2025-06-13 05:02:53.211643856 +0000 UTC m=+731.657891319" observedRunningTime="2025-06-13 05:02:53.816815828 +0000 UTC m=+732.263063331" watchObservedRunningTime="2025-06-13 05:02:53.817639151 +0000 UTC m=+732.263886614" Jun 13 05:02:54 crc kubenswrapper[4894]: I0613 05:02:54.771324 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:54 crc kubenswrapper[4894]: I0613 05:02:54.771640 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:02:55 crc kubenswrapper[4894]: I0613 05:02:55.824705 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-l2j25" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="registry-server" probeResult="failure" output=< Jun 13 05:02:55 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 05:02:55 crc kubenswrapper[4894]: > Jun 13 05:02:56 crc kubenswrapper[4894]: I0613 
05:02:56.236780 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:02:56 crc kubenswrapper[4894]: I0613 05:02:56.236890 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:02:57 crc kubenswrapper[4894]: I0613 05:02:57.856126 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:57 crc kubenswrapper[4894]: I0613 05:02:57.856437 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:02:58 crc kubenswrapper[4894]: I0613 05:02:58.940693 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bl55k" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="registry-server" probeResult="failure" output=< Jun 13 05:02:58 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 05:02:58 crc kubenswrapper[4894]: > Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.377632 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-57f4dc9749-rf6b7" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.470084 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-97b97479c-jw8m6" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.585719 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-68f4bbb747-nfmz2" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.758122 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-75b8755b74-q5plz" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.783943 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5df6744645-ll2wl" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.831642 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7d4bbc7f54-r57lj" Jun 13 05:03:01 crc kubenswrapper[4894]: I0613 05:03:01.881459 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-664db87fd8-m64zp" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.082419 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9f78645d5-s9r55" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.231094 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6db7bffb67-rnhvc" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.312132 4894 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.334563 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/crc-debug-svt4h"] Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.335574 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.337422 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vqjvc" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.513106 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.513472 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8whs\" (UniqueName: \"kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.615046 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.615156 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8whs\" (UniqueName: \"kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.615169 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.641966 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8whs\" (UniqueName: \"kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs\") pod \"crc-debug-svt4h\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.651105 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:02 crc kubenswrapper[4894]: W0613 05:03:02.680335 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ff0afd_61a6_465b_a838_a77417552ccc.slice/crio-d1bf62584cfb1205ee6e0d58f1026ffcbf20ea281eba25e5264a3d383591b5f8 WatchSource:0}: Error finding container d1bf62584cfb1205ee6e0d58f1026ffcbf20ea281eba25e5264a3d383591b5f8: Status 404 returned error can't find the container with id d1bf62584cfb1205ee6e0d58f1026ffcbf20ea281eba25e5264a3d383591b5f8 Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.852906 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/crc-debug-svt4h" event={"ID":"93ff0afd-61a6-465b-a838-a77417552ccc","Type":"ContainerStarted","Data":"d1bf62584cfb1205ee6e0d58f1026ffcbf20ea281eba25e5264a3d383591b5f8"} Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.854176 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" event={"ID":"292d16cc-5623-4aa8-a644-2e69a901ca6f","Type":"ContainerStarted","Data":"4f9177920873f8a18000d44185a5264d8782d6e02e8e804af49a664f0c2a5c99"} Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.854377 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:03:02 crc kubenswrapper[4894]: I0613 05:03:02.871691 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" podStartSLOduration=3.221862361 podStartE2EDuration="41.871674171s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.094567295 +0000 UTC m=+701.540814758" lastFinishedPulling="2025-06-13 05:03:01.744379105 +0000 UTC m=+740.190626568" observedRunningTime="2025-06-13 05:03:02.869632824 +0000 UTC m=+741.315880287" watchObservedRunningTime="2025-06-13 05:03:02.871674171 +0000 UTC m=+741.317921634" Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.811157 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt" Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.870288 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/crc-debug-svt4h" event={"ID":"93ff0afd-61a6-465b-a838-a77417552ccc","Type":"ContainerStarted","Data":"c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e"} Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.872947 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" event={"ID":"4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0","Type":"ContainerStarted","Data":"2cc565a53827f403a5241ea7efc0a25a058679762213bc123b98077d51cefdb2"} Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.873393 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.886841 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/crc-debug-svt4h" podStartSLOduration=1.886817628 podStartE2EDuration="1.886817628s" podCreationTimestamp="2025-06-13 05:03:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:03:03.885506641 +0000 UTC m=+742.331754114" watchObservedRunningTime="2025-06-13 05:03:03.886817628 +0000 UTC m=+742.333065091" Jun 13 05:03:03 crc kubenswrapper[4894]: I0613 05:03:03.910095 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" podStartSLOduration=3.035249345 podStartE2EDuration="42.910053822s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.083515224 +0000 UTC m=+701.529762687" lastFinishedPulling="2025-06-13 05:03:02.958319701 +0000 UTC m=+741.404567164" observedRunningTime="2025-06-13 05:03:03.910043702 +0000 UTC m=+742.356291165" watchObservedRunningTime="2025-06-13 05:03:03.910053822 +0000 UTC m=+742.356301295" Jun 13 05:03:04 crc kubenswrapper[4894]: I0613 05:03:04.849579 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:03:04 crc kubenswrapper[4894]: I0613 05:03:04.893906 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" event={"ID":"43453734-49dd-48b0-86b4-46b20966f2f5","Type":"ContainerStarted","Data":"d15a48e82e1b667bc0d01c6ab1357e797b664bdd29b39d5e5becbcb72421c987"} Jun 13 05:03:04 crc kubenswrapper[4894]: I0613 05:03:04.912100 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" podStartSLOduration=3.04860491 podStartE2EDuration="43.91207145s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.874310262 +0000 UTC m=+701.320557725" lastFinishedPulling="2025-06-13 05:03:03.737776762 +0000 UTC m=+742.184024265" observedRunningTime="2025-06-13 05:03:04.911533235 +0000 UTC m=+743.357780708" watchObservedRunningTime="2025-06-13 05:03:04.91207145 +0000 UTC m=+743.358318953" Jun 13 05:03:04 crc kubenswrapper[4894]: I0613 05:03:04.933701 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.082865 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.912968 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" event={"ID":"ea63dc95-4a48-4ed4-b990-c6990bbe3d33","Type":"ContainerStarted","Data":"b22bee8ff50943828386b357dcbbf8a70dc287e41a7ae372177e582358dced93"} Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.914058 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.916139 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" event={"ID":"6e780a91-140a-4b7b-9748-c3a6c3b954e1","Type":"ContainerStarted","Data":"27ad203809a77285d0da92a67b955c70086a78de4c5811103c8ce7f8bea8e53e"} Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.916269 4894 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-l2j25" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="registry-server" containerID="cri-o://2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314" gracePeriod=2 Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.916630 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.942989 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" podStartSLOduration=2.75881802 podStartE2EDuration="44.942968241s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.590090458 +0000 UTC m=+701.036337921" lastFinishedPulling="2025-06-13 05:03:04.774240679 +0000 UTC m=+743.220488142" observedRunningTime="2025-06-13 05:03:05.936330084 +0000 UTC m=+744.382577587" watchObservedRunningTime="2025-06-13 05:03:05.942968241 +0000 UTC m=+744.389215714" Jun 13 05:03:05 crc kubenswrapper[4894]: I0613 05:03:05.960949 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" podStartSLOduration=2.6191855090000002 podStartE2EDuration="44.960931477s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.43104348 +0000 UTC m=+700.877290943" lastFinishedPulling="2025-06-13 05:03:04.772789448 +0000 UTC m=+743.219036911" observedRunningTime="2025-06-13 05:03:05.95747793 +0000 UTC m=+744.403725423" watchObservedRunningTime="2025-06-13 05:03:05.960931477 +0000 UTC m=+744.407178940" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.290954 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.315291 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqxgq\" (UniqueName: \"kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq\") pod \"122e318e-5319-4a9c-99a4-2148d77abf9a\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.315393 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities\") pod \"122e318e-5319-4a9c-99a4-2148d77abf9a\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.315424 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content\") pod \"122e318e-5319-4a9c-99a4-2148d77abf9a\" (UID: \"122e318e-5319-4a9c-99a4-2148d77abf9a\") " Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.318971 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities" (OuterVolumeSpecName: "utilities") pod "122e318e-5319-4a9c-99a4-2148d77abf9a" (UID: "122e318e-5319-4a9c-99a4-2148d77abf9a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.326140 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq" (OuterVolumeSpecName: "kube-api-access-hqxgq") pod "122e318e-5319-4a9c-99a4-2148d77abf9a" (UID: "122e318e-5319-4a9c-99a4-2148d77abf9a"). InnerVolumeSpecName "kube-api-access-hqxgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.367380 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "122e318e-5319-4a9c-99a4-2148d77abf9a" (UID: "122e318e-5319-4a9c-99a4-2148d77abf9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.417077 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqxgq\" (UniqueName: \"kubernetes.io/projected/122e318e-5319-4a9c-99a4-2148d77abf9a-kube-api-access-hqxgq\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.417314 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.417378 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e318e-5319-4a9c-99a4-2148d77abf9a-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.926220 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" event={"ID":"898d7bc9-6d9c-4e81-b72e-fdb6f7440b43","Type":"ContainerStarted","Data":"b1ef18b4cd4f560f2a224a3b791f2f5bba743199285dfa195fbb0541c5f90cf0"} Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.926902 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.928693 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" event={"ID":"5c09333e-da20-4f48-96b9-29021e93149b","Type":"ContainerStarted","Data":"a5223c8a1673e7d057ffb835c07859a7296ea7c321f7833f65bed0b4ea61b0f8"} Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.929236 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.931849 4894 generic.go:334] "Generic (PLEG): container finished" podID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerID="2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314" exitCode=0 Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.931955 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerDied","Data":"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314"} Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.932002 4894 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2j25" event={"ID":"122e318e-5319-4a9c-99a4-2148d77abf9a","Type":"ContainerDied","Data":"c84c8fc1d10bdca1608e2689982a91b546b83754faf63d83a064ac9b964e6dbc"} Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.932024 4894 scope.go:117] "RemoveContainer" containerID="2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.931927 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2j25" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.935678 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" event={"ID":"297946dc-5d6d-4389-bff3-3044865254ef","Type":"ContainerStarted","Data":"b91f4e8bdba190e9cbce826b418832fdd0de1b9c44ce1f190ba255be5a7ddd0a"} Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.936200 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.957332 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" podStartSLOduration=3.549770775 podStartE2EDuration="45.957314156s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.559511669 +0000 UTC m=+702.005759132" lastFinishedPulling="2025-06-13 05:03:05.96705505 +0000 UTC m=+744.413302513" observedRunningTime="2025-06-13 05:03:06.952290775 +0000 UTC m=+745.398538238" watchObservedRunningTime="2025-06-13 05:03:06.957314156 +0000 UTC m=+745.403561619" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.969245 4894 scope.go:117] "RemoveContainer" containerID="d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.985546 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" podStartSLOduration=3.7175519 podStartE2EDuration="45.985525611s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.555847605 +0000 UTC m=+702.002095068" lastFinishedPulling="2025-06-13 05:03:05.823821296 +0000 UTC m=+744.270068779" observedRunningTime="2025-06-13 05:03:06.979378628 +0000 UTC m=+745.425626101" watchObservedRunningTime="2025-06-13 05:03:06.985525611 +0000 UTC m=+745.431773074" Jun 13 05:03:06 crc kubenswrapper[4894]: I0613 05:03:06.999058 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" podStartSLOduration=3.062908425 podStartE2EDuration="45.999042482s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.886726672 +0000 UTC m=+701.332974135" lastFinishedPulling="2025-06-13 05:03:05.822860729 +0000 UTC m=+744.269108192" observedRunningTime="2025-06-13 05:03:06.992752694 +0000 UTC m=+745.439000147" watchObservedRunningTime="2025-06-13 05:03:06.999042482 +0000 UTC m=+745.445289945" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.010337 4894 scope.go:117] "RemoveContainer" containerID="17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 
05:03:07.011257 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.018785 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2j25"] Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.044923 4894 scope.go:117] "RemoveContainer" containerID="2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314" Jun 13 05:03:07 crc kubenswrapper[4894]: E0613 05:03:07.045610 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314\": container with ID starting with 2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314 not found: ID does not exist" containerID="2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.045667 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314"} err="failed to get container status \"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314\": rpc error: code = NotFound desc = could not find container \"2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314\": container with ID starting with 2c29ab5de0141be812e8ead7169c137076cf91cbd40d6d911fe765470bfa0314 not found: ID does not exist" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.045697 4894 scope.go:117] "RemoveContainer" containerID="d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5" Jun 13 05:03:07 crc kubenswrapper[4894]: E0613 05:03:07.045985 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5\": container with ID starting with d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5 not found: ID does not exist" containerID="d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.046026 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5"} err="failed to get container status \"d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5\": rpc error: code = NotFound desc = could not find container \"d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5\": container with ID starting with d9e092eb3fe382518888134dbdd2d4a2224dffd1b9162b363bbf77f5c47234f5 not found: ID does not exist" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.046046 4894 scope.go:117] "RemoveContainer" containerID="17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471" Jun 13 05:03:07 crc kubenswrapper[4894]: E0613 05:03:07.046514 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471\": container with ID starting with 17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471 not found: ID does not exist" containerID="17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.046546 4894 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471"} err="failed to get container status \"17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471\": rpc error: code = NotFound desc = could not find container \"17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471\": container with ID starting with 17394dd41682d047630889fa59ffd0c43a9904c025cd47125b4ef1a5b32c3471 not found: ID does not exist" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.930527 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.946541 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" event={"ID":"784a682d-1749-4399-a1f4-1e8bee7968ce","Type":"ContainerStarted","Data":"d1765d96a69cbe65a9a7c982467d7125818cfb35653db6290476de4a0b32e4e6"} Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.947259 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:03:07 crc kubenswrapper[4894]: I0613 05:03:07.987495 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" podStartSLOduration=2.138312566 podStartE2EDuration="46.987473916s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:22.06070288 +0000 UTC m=+700.506950343" lastFinishedPulling="2025-06-13 05:03:06.90986422 +0000 UTC m=+745.356111693" observedRunningTime="2025-06-13 05:03:07.979247444 +0000 UTC m=+746.425494917" watchObservedRunningTime="2025-06-13 05:03:07.987473916 +0000 UTC m=+746.433721389" Jun 13 05:03:08 crc kubenswrapper[4894]: I0613 05:03:08.014183 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:03:08 crc kubenswrapper[4894]: I0613 05:03:08.290158 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" path="/var/lib/kubelet/pods/122e318e-5319-4a9c-99a4-2148d77abf9a/volumes" Jun 13 05:03:08 crc kubenswrapper[4894]: I0613 05:03:08.957123 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" event={"ID":"de71738a-f07f-49c4-9820-1480db37be05","Type":"ContainerStarted","Data":"805ca74e186e33e282afaa3bd4af971b0f59d9ce53307d2628a5e78ac577b4a9"} Jun 13 05:03:08 crc kubenswrapper[4894]: I0613 05:03:08.957721 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:03:08 crc kubenswrapper[4894]: I0613 05:03:08.982277 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" podStartSLOduration=3.772680313 podStartE2EDuration="47.98224868s" podCreationTimestamp="2025-06-13 05:02:21 +0000 UTC" firstStartedPulling="2025-06-13 05:02:23.611994806 +0000 UTC m=+702.058242269" lastFinishedPulling="2025-06-13 05:03:07.821563143 +0000 UTC m=+746.267810636" observedRunningTime="2025-06-13 05:03:08.973420451 +0000 UTC m=+747.419667944" watchObservedRunningTime="2025-06-13 05:03:08.98224868 +0000 UTC m=+747.428496183" Jun 13 05:03:09 crc 
kubenswrapper[4894]: I0613 05:03:09.490482 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:03:09 crc kubenswrapper[4894]: I0613 05:03:09.490942 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bl55k" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="registry-server" containerID="cri-o://a906cc013dd326881c943e6d5c8c8e55d9bc1073feb9d43c78c73a318f1f8f59" gracePeriod=2 Jun 13 05:03:09 crc kubenswrapper[4894]: I0613 05:03:09.970347 4894 generic.go:334] "Generic (PLEG): container finished" podID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerID="a906cc013dd326881c943e6d5c8c8e55d9bc1073feb9d43c78c73a318f1f8f59" exitCode=0 Jun 13 05:03:09 crc kubenswrapper[4894]: I0613 05:03:09.970945 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerDied","Data":"a906cc013dd326881c943e6d5c8c8e55d9bc1073feb9d43c78c73a318f1f8f59"} Jun 13 05:03:09 crc kubenswrapper[4894]: I0613 05:03:09.970995 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl55k" event={"ID":"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3","Type":"ContainerDied","Data":"d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75"} Jun 13 05:03:09 crc kubenswrapper[4894]: I0613 05:03:09.971006 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a1bb59a30261b02e29d2f770f82ccf56337fc86b84b0b47f6b0b2facc9da75" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.029746 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.080534 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities\") pod \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.080641 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdbdm\" (UniqueName: \"kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm\") pod \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.080804 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content\") pod \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\" (UID: \"29ce5fae-aa1c-478a-ab81-8043e7d3c3c3\") " Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.081788 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities" (OuterVolumeSpecName: "utilities") pod "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" (UID: "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.092932 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm" (OuterVolumeSpecName: "kube-api-access-qdbdm") pod "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" (UID: "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3"). InnerVolumeSpecName "kube-api-access-qdbdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.151124 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" (UID: "29ce5fae-aa1c-478a-ab81-8043e7d3c3c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.183024 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.183066 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdbdm\" (UniqueName: \"kubernetes.io/projected/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-kube-api-access-qdbdm\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.183083 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:10 crc kubenswrapper[4894]: I0613 05:03:10.980434 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bl55k" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.021403 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.033530 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bl55k"] Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.455548 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-b554678df-6trss" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.528249 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5486f4b54f-xdn4k" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.548254 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7777cf768b-bm84t" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.705014 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.708784 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5ccbd96f89-hrh2h" Jun 13 05:03:11 crc kubenswrapper[4894]: I0613 05:03:11.927051 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-857f9d6b88-pt7m6" Jun 13 05:03:12 crc kubenswrapper[4894]: I0613 05:03:12.083516 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-58f798889d-2n26t" Jun 13 05:03:12 crc kubenswrapper[4894]: I0613 05:03:12.157506 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7779c57cf7-7zldr" Jun 13 05:03:12 crc kubenswrapper[4894]: I0613 05:03:12.260531 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5b4ccb8c4-2mcf5" Jun 13 05:03:12 crc kubenswrapper[4894]: I0613 05:03:12.297639 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" path="/var/lib/kubelet/pods/29ce5fae-aa1c-478a-ab81-8043e7d3c3c3/volumes" Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.198453 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/crc-debug-svt4h"] Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.198704 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/crc-debug-svt4h" podUID="93ff0afd-61a6-465b-a838-a77417552ccc" containerName="container-00" containerID="cri-o://c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e" gracePeriod=2 Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.204536 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/crc-debug-svt4h"] Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.267124 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:13 crc kubenswrapper[4894]: E0613 05:03:13.320247 4894 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ff0afd_61a6_465b_a838_a77417552ccc.slice/crio-c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ff0afd_61a6_465b_a838_a77417552ccc.slice/crio-conmon-c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e.scope\": RecentStats: unable to find data in memory cache]" Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.330795 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8whs\" (UniqueName: \"kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs\") pod \"93ff0afd-61a6-465b-a838-a77417552ccc\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.331004 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host\") pod \"93ff0afd-61a6-465b-a838-a77417552ccc\" (UID: \"93ff0afd-61a6-465b-a838-a77417552ccc\") " Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.331314 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host" (OuterVolumeSpecName: "host") pod "93ff0afd-61a6-465b-a838-a77417552ccc" (UID: "93ff0afd-61a6-465b-a838-a77417552ccc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.335902 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs" (OuterVolumeSpecName: "kube-api-access-f8whs") pod "93ff0afd-61a6-465b-a838-a77417552ccc" (UID: "93ff0afd-61a6-465b-a838-a77417552ccc"). InnerVolumeSpecName "kube-api-access-f8whs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.432238 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ff0afd-61a6-465b-a838-a77417552ccc-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:13 crc kubenswrapper[4894]: I0613 05:03:13.432277 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8whs\" (UniqueName: \"kubernetes.io/projected/93ff0afd-61a6-465b-a838-a77417552ccc-kube-api-access-f8whs\") on node \"crc\" DevicePath \"\"" Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.012548 4894 generic.go:334] "Generic (PLEG): container finished" podID="93ff0afd-61a6-465b-a838-a77417552ccc" containerID="c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e" exitCode=0 Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.012688 4894 scope.go:117] "RemoveContainer" containerID="c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e" Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.012792 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/crc-debug-svt4h" Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.044394 4894 scope.go:117] "RemoveContainer" containerID="c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e" Jun 13 05:03:14 crc kubenswrapper[4894]: E0613 05:03:14.044919 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e\": container with ID starting with c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e not found: ID does not exist" containerID="c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e" Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.044969 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e"} err="failed to get container status \"c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e\": rpc error: code = NotFound desc = could not find container \"c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e\": container with ID starting with c910b25ddb638424bade11f5df3769fc2d5d06919a687c96fa84d9a21b69b96e not found: ID does not exist" Jun 13 05:03:14 crc kubenswrapper[4894]: I0613 05:03:14.291341 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ff0afd-61a6-465b-a838-a77417552ccc" path="/var/lib/kubelet/pods/93ff0afd-61a6-465b-a838-a77417552ccc/volumes" Jun 13 05:03:21 crc kubenswrapper[4894]: I0613 05:03:21.369070 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-9889b4756-lsslv" Jun 13 05:03:22 crc kubenswrapper[4894]: I0613 05:03:22.200579 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-884d667-sk2l9" Jun 13 05:03:26 crc kubenswrapper[4894]: I0613 05:03:26.236835 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:03:26 crc kubenswrapper[4894]: I0613 05:03:26.237615 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:03:26 crc kubenswrapper[4894]: I0613 05:03:26.237708 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:03:26 crc kubenswrapper[4894]: I0613 05:03:26.238842 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:03:26 crc kubenswrapper[4894]: I0613 05:03:26.238965 4894 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe" gracePeriod=600 Jun 13 05:03:27 crc kubenswrapper[4894]: I0613 05:03:27.127759 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe" exitCode=0 Jun 13 05:03:27 crc kubenswrapper[4894]: I0613 05:03:27.128173 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe"} Jun 13 05:03:27 crc kubenswrapper[4894]: I0613 05:03:27.128217 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47"} Jun 13 05:03:27 crc kubenswrapper[4894]: I0613 05:03:27.128246 4894 scope.go:117] "RemoveContainer" containerID="f7fc4190cac312996663010960a2fee97deb02b3216bd4f6efea74f02e4a5efa" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.441242 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.441983 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="extract-utilities" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.441995 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="extract-utilities" Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.442022 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442028 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.442043 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442052 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.442067 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="extract-utilities" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442073 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="extract-utilities" Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.442100 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ff0afd-61a6-465b-a838-a77417552ccc" containerName="container-00" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442105 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ff0afd-61a6-465b-a838-a77417552ccc" containerName="container-00" Jun 13 05:03:39 crc kubenswrapper[4894]: 
E0613 05:03:39.442120 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="extract-content" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442125 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="extract-content" Jun 13 05:03:39 crc kubenswrapper[4894]: E0613 05:03:39.442139 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="extract-content" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442145 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="extract-content" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442287 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ff0afd-61a6-465b-a838-a77417552ccc" containerName="container-00" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442296 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="122e318e-5319-4a9c-99a4-2148d77abf9a" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442305 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ce5fae-aa1c-478a-ab81-8043e7d3c3c3" containerName="registry-server" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.442949 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.447526 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.447553 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.447728 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.447856 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-d6q4j" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.459384 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.502565 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.502665 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvn4\" (UniqueName: \"kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.514767 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.515851 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.521216 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.532369 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.604085 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.604134 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.604183 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.604226 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsqc\" (UniqueName: \"kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.604252 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvn4\" (UniqueName: \"kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.605329 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.636600 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvn4\" (UniqueName: \"kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4\") pod \"dnsmasq-dns-546489d6df-hwtjg\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.705463 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 
05:03:39.705520 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.705566 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsqc\" (UniqueName: \"kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.706687 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.707239 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.727923 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsqc\" (UniqueName: \"kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc\") pod \"dnsmasq-dns-7566756bf-pvnxx\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.757761 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:03:39 crc kubenswrapper[4894]: I0613 05:03:39.827837 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:03:40 crc kubenswrapper[4894]: I0613 05:03:40.235145 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:03:40 crc kubenswrapper[4894]: I0613 05:03:40.323319 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:03:40 crc kubenswrapper[4894]: W0613 05:03:40.326073 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod362d8879_aff0_4e42_b7b8_e6c7afedec8e.slice/crio-c9d834e7e8eaa93d3f57dca30055ca7d5dbbe3909bdc0cbe2665cdbdc8e1061a WatchSource:0}: Error finding container c9d834e7e8eaa93d3f57dca30055ca7d5dbbe3909bdc0cbe2665cdbdc8e1061a: Status 404 returned error can't find the container with id c9d834e7e8eaa93d3f57dca30055ca7d5dbbe3909bdc0cbe2665cdbdc8e1061a Jun 13 05:03:41 crc kubenswrapper[4894]: I0613 05:03:41.249403 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" event={"ID":"b8090f0c-46ca-42e0-9928-06a7862fdf45","Type":"ContainerStarted","Data":"760fdefac7511b230d34cb1b68682b96470825d37592b8259f71f737c3b272ae"} Jun 13 05:03:41 crc kubenswrapper[4894]: I0613 05:03:41.250847 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" event={"ID":"362d8879-aff0-4e42-b7b8-e6c7afedec8e","Type":"ContainerStarted","Data":"c9d834e7e8eaa93d3f57dca30055ca7d5dbbe3909bdc0cbe2665cdbdc8e1061a"} Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.351532 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.352063 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.354891 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.390243 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.461637 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnplk\" (UniqueName: \"kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.461723 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.461766 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.564934 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnplk\" (UniqueName: \"kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.564987 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.565011 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.565942 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.566180 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.598016 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnplk\" (UniqueName: 
\"kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk\") pod \"dnsmasq-dns-5544c68b5-lbkpx\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.613621 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.627392 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.628490 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.647276 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.689042 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.767258 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.767343 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.767362 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4drbw\" (UniqueName: \"kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.868979 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.869347 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.869374 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4drbw\" (UniqueName: \"kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.870399 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.870944 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.899503 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4drbw\" (UniqueName: \"kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw\") pod \"dnsmasq-dns-d5db84f4f-r2x2r\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:42 crc kubenswrapper[4894]: I0613 05:03:42.955924 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.477012 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:03:43 crc kubenswrapper[4894]: W0613 05:03:43.482608 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af3e3f7_f02d_4b2c_8662_435448da1ca5.slice/crio-65131294e9edcb0a7fc260e0eafac23b35ccd034ab90bb27e90332c0aa58d6b3 WatchSource:0}: Error finding container 65131294e9edcb0a7fc260e0eafac23b35ccd034ab90bb27e90332c0aa58d6b3: Status 404 returned error can't find the container with id 65131294e9edcb0a7fc260e0eafac23b35ccd034ab90bb27e90332c0aa58d6b3 Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.502531 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.503849 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509203 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509237 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509428 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509640 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509849 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.509977 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gqlpx" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.511200 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.523135 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.611431 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:03:43 crc kubenswrapper[4894]: W0613 05:03:43.625033 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f1392b_00f5_4ed6_8f88_0a7a79134e67.slice/crio-012887b33bdfdedb0552038bc3b30eeb03ff965a1d441dc5cd24ba991ef46ffb WatchSource:0}: Error finding container 012887b33bdfdedb0552038bc3b30eeb03ff965a1d441dc5cd24ba991ef46ffb: Status 404 returned error can't find the container with id 012887b33bdfdedb0552038bc3b30eeb03ff965a1d441dc5cd24ba991ef46ffb Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692301 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692339 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692409 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692429 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67fn\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692463 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692478 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692493 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692513 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692539 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.692555 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793453 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793491 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 
05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793510 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793537 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793567 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793583 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793602 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793623 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793665 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793711 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.793727 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67fn\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.794008 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" 
(UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.794578 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.794805 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.795025 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.797149 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.799152 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.799427 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.799867 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.807743 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.812097 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.817788 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67fn\" (UniqueName: 
\"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.831227 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.834696 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.943533 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.987771 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.988088 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.994843 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.996208 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.996313 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.996382 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7sczr" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.996511 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jun 13 05:03:43 crc kubenswrapper[4894]: I0613 05:03:43.996535 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.000953 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097028 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbm6\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097088 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097338 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") 
" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097435 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097469 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097486 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097501 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097518 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097538 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097566 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.097583 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.198671 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.198982 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199002 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199020 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199055 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199086 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199120 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199137 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199171 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbm6\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.199205 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc 
kubenswrapper[4894]: I0613 05:03:44.199226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.200050 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.200954 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.201480 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.201705 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.202105 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.204990 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.205801 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.206445 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.207975 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.209317 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.219135 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbm6\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.237225 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.311103 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.324019 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:03:44 crc kubenswrapper[4894]: W0613 05:03:44.327135 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5347c46f_ac9a_4ec1_bf62_29e88fb89033.slice/crio-d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75 WatchSource:0}: Error finding container d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75: Status 404 returned error can't find the container with id d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75 Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.353870 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" event={"ID":"6af3e3f7-f02d-4b2c-8662-435448da1ca5","Type":"ContainerStarted","Data":"65131294e9edcb0a7fc260e0eafac23b35ccd034ab90bb27e90332c0aa58d6b3"} Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.360921 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerStarted","Data":"d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75"} Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.381260 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" event={"ID":"96f1392b-00f5-4ed6-8f88-0a7a79134e67","Type":"ContainerStarted","Data":"012887b33bdfdedb0552038bc3b30eeb03ff965a1d441dc5cd24ba991ef46ffb"} Jun 13 05:03:44 crc kubenswrapper[4894]: I0613 05:03:44.872950 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.763057 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.764727 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.770812 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zn6xn" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.771907 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.772116 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.772450 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.773287 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.773293 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.782843 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946291 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946330 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-kolla-config\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946451 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-default\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946579 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6d6\" (UniqueName: \"kubernetes.io/projected/138daa45-0563-4c44-8b99-9bfb66eea5c6-kube-api-access-sn6d6\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946766 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-secrets\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.946911 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc 
kubenswrapper[4894]: I0613 05:03:45.946955 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.947210 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:45 crc kubenswrapper[4894]: I0613 05:03:45.947235 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.048921 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.048958 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049011 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049028 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049046 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049062 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-kolla-config\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049081 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-default\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049107 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6d6\" (UniqueName: \"kubernetes.io/projected/138daa45-0563-4c44-8b99-9bfb66eea5c6-kube-api-access-sn6d6\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049141 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-secrets\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.049271 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.054681 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-kolla-config\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.054868 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-default\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.055330 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/138daa45-0563-4c44-8b99-9bfb66eea5c6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.056107 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138daa45-0563-4c44-8b99-9bfb66eea5c6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.065845 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-secrets\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.066357 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 
05:03:46.068050 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn6d6\" (UniqueName: \"kubernetes.io/projected/138daa45-0563-4c44-8b99-9bfb66eea5c6-kube-api-access-sn6d6\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.068433 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/138daa45-0563-4c44-8b99-9bfb66eea5c6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.076780 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"138daa45-0563-4c44-8b99-9bfb66eea5c6\") " pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.101476 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.212285 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.214182 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.219862 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.220854 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8jbw6" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.220892 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.221107 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.227262 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357626 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357702 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357846 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " 
pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357917 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357944 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglcw\" (UniqueName: \"kubernetes.io/projected/5e584455-5537-425d-a454-063087cc3fea-kube-api-access-rglcw\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.357966 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.358046 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.358153 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e584455-5537-425d-a454-063087cc3fea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.358187 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459545 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459620 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459641 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglcw\" (UniqueName: \"kubernetes.io/projected/5e584455-5537-425d-a454-063087cc3fea-kube-api-access-rglcw\") pod \"openstack-cell1-galera-0\" (UID: 
\"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459669 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459696 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459750 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e584455-5537-425d-a454-063087cc3fea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459771 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459796 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.459847 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.460288 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.460326 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.460560 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5e584455-5537-425d-a454-063087cc3fea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: 
I0613 05:03:46.460818 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.461747 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e584455-5537-425d-a454-063087cc3fea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.463554 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.482247 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.482297 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e584455-5537-425d-a454-063087cc3fea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.484864 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglcw\" (UniqueName: \"kubernetes.io/projected/5e584455-5537-425d-a454-063087cc3fea-kube-api-access-rglcw\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.496440 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5e584455-5537-425d-a454-063087cc3fea\") " pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.540053 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.755885 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.757037 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.759937 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-l6hlm" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.760259 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.760443 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.766971 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.866305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrb6\" (UniqueName: \"kubernetes.io/projected/1bb10c27-9b94-43cd-82df-407e68605449-kube-api-access-ntrb6\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.866362 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-config-data\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.866417 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.866485 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.866503 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-kolla-config\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.967845 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.967903 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-kolla-config\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.967926 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrb6\" (UniqueName: 
\"kubernetes.io/projected/1bb10c27-9b94-43cd-82df-407e68605449-kube-api-access-ntrb6\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.967972 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-config-data\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.968033 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.969627 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-kolla-config\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.969788 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1bb10c27-9b94-43cd-82df-407e68605449-config-data\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.979846 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.987453 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb10c27-9b94-43cd-82df-407e68605449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:46 crc kubenswrapper[4894]: I0613 05:03:46.993998 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrb6\" (UniqueName: \"kubernetes.io/projected/1bb10c27-9b94-43cd-82df-407e68605449-kube-api-access-ntrb6\") pod \"memcached-0\" (UID: \"1bb10c27-9b94-43cd-82df-407e68605449\") " pod="openstack/memcached-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.140377 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.591316 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.592329 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.598492 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.598746 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2fv9w" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.703301 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqvq\" (UniqueName: \"kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq\") pod \"kube-state-metrics-0\" (UID: \"eaa87fe1-544c-4780-a350-acb43e14d346\") " pod="openstack/kube-state-metrics-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.805949 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqvq\" (UniqueName: \"kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq\") pod \"kube-state-metrics-0\" (UID: \"eaa87fe1-544c-4780-a350-acb43e14d346\") " pod="openstack/kube-state-metrics-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.841843 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqvq\" (UniqueName: \"kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq\") pod \"kube-state-metrics-0\" (UID: \"eaa87fe1-544c-4780-a350-acb43e14d346\") " pod="openstack/kube-state-metrics-0" Jun 13 05:03:47 crc kubenswrapper[4894]: I0613 05:03:47.917487 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.831673 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.832768 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.835533 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qwwks" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.835847 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.835939 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.843021 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.962683 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.962728 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcmz\" (UniqueName: \"kubernetes.io/projected/e3d6bb95-a363-4ac9-8034-dff9e9642464-kube-api-access-7pcmz\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.962797 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.962834 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:50 crc kubenswrapper[4894]: I0613 05:03:50.962858 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3d6bb95-a363-4ac9-8034-dff9e9642464-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.063949 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.064008 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.064045 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/e3d6bb95-a363-4ac9-8034-dff9e9642464-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.064075 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.064094 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcmz\" (UniqueName: \"kubernetes.io/projected/e3d6bb95-a363-4ac9-8034-dff9e9642464-kube-api-access-7pcmz\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.064952 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.065790 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3d6bb95-a363-4ac9-8034-dff9e9642464-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.071820 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.073321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d6bb95-a363-4ac9-8034-dff9e9642464-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.083134 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcmz\" (UniqueName: \"kubernetes.io/projected/e3d6bb95-a363-4ac9-8034-dff9e9642464-kube-api-access-7pcmz\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.084389 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e3d6bb95-a363-4ac9-8034-dff9e9642464\") " pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.148843 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.814840 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5zm5p"] Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.815689 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.818922 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6pr7r" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.819757 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.819928 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.898192 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5zm5p"] Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.902578 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vb6p4"] Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.903977 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:51 crc kubenswrapper[4894]: I0613 05:03:51.921177 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vb6p4"] Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003230 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-log\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003334 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-ovn-controller-tls-certs\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-run\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003405 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e715a67e-623b-4d05-8bc9-676747d445fb-scripts\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003430 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-log-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " 
pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003461 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-lib\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003631 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/138962ec-89d1-4771-adad-e9a0d910e80b-scripts\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003716 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pv7x\" (UniqueName: \"kubernetes.io/projected/e715a67e-623b-4d05-8bc9-676747d445fb-kube-api-access-6pv7x\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003780 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qr6z\" (UniqueName: \"kubernetes.io/projected/138962ec-89d1-4771-adad-e9a0d910e80b-kube-api-access-4qr6z\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003947 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-etc-ovs\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.003982 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-combined-ca-bundle\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.004115 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105133 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e715a67e-623b-4d05-8bc9-676747d445fb-scripts\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105188 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-log-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc 
kubenswrapper[4894]: I0613 05:03:52.105225 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-lib\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105254 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/138962ec-89d1-4771-adad-e9a0d910e80b-scripts\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105287 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pv7x\" (UniqueName: \"kubernetes.io/projected/e715a67e-623b-4d05-8bc9-676747d445fb-kube-api-access-6pv7x\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105318 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qr6z\" (UniqueName: \"kubernetes.io/projected/138962ec-89d1-4771-adad-e9a0d910e80b-kube-api-access-4qr6z\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105380 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-combined-ca-bundle\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105403 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-etc-ovs\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105448 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105481 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105508 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-log\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105531 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-ovn-controller-tls-certs\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105556 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-run\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105737 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-log-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105826 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-run\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.105996 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-etc-ovs\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.106096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run-ovn\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.106150 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e715a67e-623b-4d05-8bc9-676747d445fb-var-run\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.106265 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-log\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.106862 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/138962ec-89d1-4771-adad-e9a0d910e80b-var-lib\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.107947 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e715a67e-623b-4d05-8bc9-676747d445fb-scripts\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.111610 4894 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-ovn-controller-tls-certs\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.121299 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pv7x\" (UniqueName: \"kubernetes.io/projected/e715a67e-623b-4d05-8bc9-676747d445fb-kube-api-access-6pv7x\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.121451 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715a67e-623b-4d05-8bc9-676747d445fb-combined-ca-bundle\") pod \"ovn-controller-5zm5p\" (UID: \"e715a67e-623b-4d05-8bc9-676747d445fb\") " pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.109384 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/138962ec-89d1-4771-adad-e9a0d910e80b-scripts\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.123776 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qr6z\" (UniqueName: \"kubernetes.io/projected/138962ec-89d1-4771-adad-e9a0d910e80b-kube-api-access-4qr6z\") pod \"ovn-controller-ovs-vb6p4\" (UID: \"138962ec-89d1-4771-adad-e9a0d910e80b\") " pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.137330 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5zm5p" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.217190 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:03:52 crc kubenswrapper[4894]: I0613 05:03:52.461247 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerStarted","Data":"4ec2a238d97949879d16f86bb45b20c81ae97eb58368d9ede1849064cb415c87"} Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.677763 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.681891 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.717055 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jptdq" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.717340 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.717581 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.723892 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.871390 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.871798 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.871883 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.871953 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.872093 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rxkn\" (UniqueName: \"kubernetes.io/projected/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-kube-api-access-8rxkn\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.973993 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.974058 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.974105 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.974142 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rxkn\" (UniqueName: \"kubernetes.io/projected/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-kube-api-access-8rxkn\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.974186 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.974370 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.981860 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:54 crc kubenswrapper[4894]: I0613 05:03:54.985455 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:55 crc kubenswrapper[4894]: I0613 05:03:55.006510 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:55 crc kubenswrapper[4894]: I0613 05:03:55.007730 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rxkn\" (UniqueName: \"kubernetes.io/projected/d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32-kube-api-access-8rxkn\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:55 crc kubenswrapper[4894]: I0613 05:03:55.039232 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32\") " pod="openstack/ovsdbserver-sb-0" Jun 13 05:03:55 crc kubenswrapper[4894]: I0613 05:03:55.335817 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.036520 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.574350 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-xfb9q"] Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.575526 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.578242 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.608082 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmndf\" (UniqueName: \"kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.608165 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.709807 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.709923 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmndf\" (UniqueName: \"kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.709945 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.733000 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmndf\" (UniqueName: \"kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf\") pod \"crc-debug-xfb9q\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " pod="openstack/crc-debug-xfb9q" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.799536 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.799719 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --no-daemon --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwvn4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-546489d6df-hwtjg_openstack(b8090f0c-46ca-42e0-9928-06a7862fdf45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.801570 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" podUID="b8090f0c-46ca-42e0-9928-06a7862fdf45" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.806081 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.806278 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --no-daemon --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4drbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d5db84f4f-r2x2r_openstack(96f1392b-00f5-4ed6-8f88-0a7a79134e67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:04:01 crc kubenswrapper[4894]: E0613 05:04:01.808736 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" Jun 13 05:04:01 crc kubenswrapper[4894]: I0613 05:04:01.900647 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xfb9q" Jun 13 05:04:02 crc kubenswrapper[4894]: E0613 05:04:02.543385 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" Jun 13 05:04:03 crc kubenswrapper[4894]: W0613 05:04:03.088646 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa87fe1_544c_4780_a350_acb43e14d346.slice/crio-068f6e763ee6b33a6e948b3a9434ad07499175da44552bf35f033683890776ec WatchSource:0}: Error finding container 068f6e763ee6b33a6e948b3a9434ad07499175da44552bf35f033683890776ec: Status 404 returned error can't find the container with id 068f6e763ee6b33a6e948b3a9434ad07499175da44552bf35f033683890776ec Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.136764 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.136903 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --no-daemon --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnplk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-5544c68b5-lbkpx_openstack(6af3e3f7-f02d-4b2c-8662-435448da1ca5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.138127 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.154782 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.154903 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --no-daemon --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xsqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7566756bf-pvnxx_openstack(362d8879-aff0-4e42-b7b8-e6c7afedec8e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.156069 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" 
podUID="362d8879-aff0-4e42-b7b8-e6c7afedec8e" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.268390 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.433868 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config\") pod \"b8090f0c-46ca-42e0-9928-06a7862fdf45\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.434386 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvn4\" (UniqueName: \"kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4\") pod \"b8090f0c-46ca-42e0-9928-06a7862fdf45\" (UID: \"b8090f0c-46ca-42e0-9928-06a7862fdf45\") " Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.440546 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config" (OuterVolumeSpecName: "config") pod "b8090f0c-46ca-42e0-9928-06a7862fdf45" (UID: "b8090f0c-46ca-42e0-9928-06a7862fdf45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.442550 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4" (OuterVolumeSpecName: "kube-api-access-cwvn4") pod "b8090f0c-46ca-42e0-9928-06a7862fdf45" (UID: "b8090f0c-46ca-42e0-9928-06a7862fdf45"). InnerVolumeSpecName "kube-api-access-cwvn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.536388 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvn4\" (UniqueName: \"kubernetes.io/projected/b8090f0c-46ca-42e0-9928-06a7862fdf45-kube-api-access-cwvn4\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.536414 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8090f0c-46ca-42e0-9928-06a7862fdf45-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.566457 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xfb9q" event={"ID":"be04e48a-577c-4d75-9a73-30081f8881ac","Type":"ContainerStarted","Data":"21d537595f98f9262b192099f62b0918125f6b28f6c7a2cd7f8e83eae326bbc8"} Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.566500 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xfb9q" event={"ID":"be04e48a-577c-4d75-9a73-30081f8881ac","Type":"ContainerStarted","Data":"4f2e3b7672052251fcd769323a896a662c3e0e67cba5e14b1421afb0b46dd517"} Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.582379 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-xfb9q" podStartSLOduration=2.58236557 podStartE2EDuration="2.58236557s" podCreationTimestamp="2025-06-13 05:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:04:03.579642224 +0000 UTC m=+802.025889687" watchObservedRunningTime="2025-06-13 05:04:03.58236557 +0000 UTC m=+802.028613033" Jun 13 05:04:03 crc kubenswrapper[4894]: 
I0613 05:04:03.589072 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" event={"ID":"b8090f0c-46ca-42e0-9928-06a7862fdf45","Type":"ContainerDied","Data":"760fdefac7511b230d34cb1b68682b96470825d37592b8259f71f737c3b272ae"} Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.589083 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546489d6df-hwtjg" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.600161 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eaa87fe1-544c-4780-a350-acb43e14d346","Type":"ContainerStarted","Data":"068f6e763ee6b33a6e948b3a9434ad07499175da44552bf35f033683890776ec"} Jun 13 05:04:03 crc kubenswrapper[4894]: E0613 05:04:03.602088 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.706964 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.710255 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546489d6df-hwtjg"] Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.719155 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.749958 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5zm5p"] Jun 13 05:04:03 crc kubenswrapper[4894]: W0613 05:04:03.757868 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode715a67e_623b_4d05_8bc9_676747d445fb.slice/crio-e69365a8e76426c7cfa223f7209126eaf1af5f7aeacd4b35d1a147aa88ed80fc WatchSource:0}: Error finding container e69365a8e76426c7cfa223f7209126eaf1af5f7aeacd4b35d1a147aa88ed80fc: Status 404 returned error can't find the container with id e69365a8e76426c7cfa223f7209126eaf1af5f7aeacd4b35d1a147aa88ed80fc Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.862284 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jun 13 05:04:03 crc kubenswrapper[4894]: I0613 05:04:03.937064 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jun 13 05:04:04 crc kubenswrapper[4894]: W0613 05:04:04.013545 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e584455_5537_425d_a454_063087cc3fea.slice/crio-b2ba2b9f6356bba6c6b04d4671bd06a7245604851d15b66e4f0563e44ae25cac WatchSource:0}: Error finding container b2ba2b9f6356bba6c6b04d4671bd06a7245604851d15b66e4f0563e44ae25cac: Status 404 returned error can't find the container with id b2ba2b9f6356bba6c6b04d4671bd06a7245604851d15b66e4f0563e44ae25cac Jun 13 05:04:04 crc kubenswrapper[4894]: W0613 05:04:04.014922 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138daa45_0563_4c44_8b99_9bfb66eea5c6.slice/crio-ff55578dbdfba17e8ab9bbbea8ae1590f3f2241d9f1c72e1da6b69f43358a03b WatchSource:0}: Error finding container 
ff55578dbdfba17e8ab9bbbea8ae1590f3f2241d9f1c72e1da6b69f43358a03b: Status 404 returned error can't find the container with id ff55578dbdfba17e8ab9bbbea8ae1590f3f2241d9f1c72e1da6b69f43358a03b Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.202665 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.288377 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8090f0c-46ca-42e0-9928-06a7862fdf45" path="/var/lib/kubelet/pods/b8090f0c-46ca-42e0-9928-06a7862fdf45/volumes" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.353811 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsqc\" (UniqueName: \"kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc\") pod \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.353880 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config\") pod \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.353905 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc\") pod \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\" (UID: \"362d8879-aff0-4e42-b7b8-e6c7afedec8e\") " Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.354784 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config" (OuterVolumeSpecName: "config") pod "362d8879-aff0-4e42-b7b8-e6c7afedec8e" (UID: "362d8879-aff0-4e42-b7b8-e6c7afedec8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.355128 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "362d8879-aff0-4e42-b7b8-e6c7afedec8e" (UID: "362d8879-aff0-4e42-b7b8-e6c7afedec8e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.374380 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc" (OuterVolumeSpecName: "kube-api-access-6xsqc") pod "362d8879-aff0-4e42-b7b8-e6c7afedec8e" (UID: "362d8879-aff0-4e42-b7b8-e6c7afedec8e"). InnerVolumeSpecName "kube-api-access-6xsqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.426702 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.455510 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsqc\" (UniqueName: \"kubernetes.io/projected/362d8879-aff0-4e42-b7b8-e6c7afedec8e-kube-api-access-6xsqc\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.455546 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.455555 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/362d8879-aff0-4e42-b7b8-e6c7afedec8e-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:04 crc kubenswrapper[4894]: W0613 05:04:04.516167 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6d3ebc6_d7cd_4b8d_bc3f_70f6e8819a32.slice/crio-11034ca1a01ed5101b5b8513fe898dfc692eb840c3e54e3af8396ed4f18c591c WatchSource:0}: Error finding container 11034ca1a01ed5101b5b8513fe898dfc692eb840c3e54e3af8396ed4f18c591c: Status 404 returned error can't find the container with id 11034ca1a01ed5101b5b8513fe898dfc692eb840c3e54e3af8396ed4f18c591c Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.608201 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5zm5p" event={"ID":"e715a67e-623b-4d05-8bc9-676747d445fb","Type":"ContainerStarted","Data":"e69365a8e76426c7cfa223f7209126eaf1af5f7aeacd4b35d1a147aa88ed80fc"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.609663 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" event={"ID":"362d8879-aff0-4e42-b7b8-e6c7afedec8e","Type":"ContainerDied","Data":"c9d834e7e8eaa93d3f57dca30055ca7d5dbbe3909bdc0cbe2665cdbdc8e1061a"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.609723 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7566756bf-pvnxx" Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.617958 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32","Type":"ContainerStarted","Data":"11034ca1a01ed5101b5b8513fe898dfc692eb840c3e54e3af8396ed4f18c591c"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.619524 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1bb10c27-9b94-43cd-82df-407e68605449","Type":"ContainerStarted","Data":"ecd023236de57c827a88a0103692a298320a8c7a27af82ab3c02d77f061e51a0"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.621050 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5e584455-5537-425d-a454-063087cc3fea","Type":"ContainerStarted","Data":"b2ba2b9f6356bba6c6b04d4671bd06a7245604851d15b66e4f0563e44ae25cac"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.622494 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerStarted","Data":"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.624964 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"138daa45-0563-4c44-8b99-9bfb66eea5c6","Type":"ContainerStarted","Data":"ff55578dbdfba17e8ab9bbbea8ae1590f3f2241d9f1c72e1da6b69f43358a03b"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.626344 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerStarted","Data":"597c737a6ceae50c71e16da94b68f9fae3b11b7790de42f254bc4ad65fbd9c53"} Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.701343 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:04:04 crc kubenswrapper[4894]: I0613 05:04:04.707577 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7566756bf-pvnxx"] Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.076776 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vb6p4"] Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.295727 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jun 13 05:04:05 crc kubenswrapper[4894]: E0613 05:04:05.626712 4894 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:34162->38.102.83.213:40951: write tcp 38.102.83.213:34162->38.102.83.213:40951: write: broken pipe Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.639282 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-xfb9q"] Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.639465 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-xfb9q" podUID="be04e48a-577c-4d75-9a73-30081f8881ac" containerName="container-00" containerID="cri-o://21d537595f98f9262b192099f62b0918125f6b28f6c7a2cd7f8e83eae326bbc8" gracePeriod=2 Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.643696 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-xfb9q"] Jun 13 05:04:05 crc kubenswrapper[4894]: I0613 05:04:05.648638 4894 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-ovs-vb6p4" event={"ID":"138962ec-89d1-4771-adad-e9a0d910e80b","Type":"ContainerStarted","Data":"4875baa57bbcf7a70e1e6dfdd92b718a5d1ac466b594fa92cfddb916d9bc8522"} Jun 13 05:04:06 crc kubenswrapper[4894]: W0613 05:04:06.108007 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d6bb95_a363_4ac9_8034_dff9e9642464.slice/crio-622ed26fea7925c1ab8bbcaeb2f6642b0bdcb7e7897207b249203546d12cd068 WatchSource:0}: Error finding container 622ed26fea7925c1ab8bbcaeb2f6642b0bdcb7e7897207b249203546d12cd068: Status 404 returned error can't find the container with id 622ed26fea7925c1ab8bbcaeb2f6642b0bdcb7e7897207b249203546d12cd068 Jun 13 05:04:06 crc kubenswrapper[4894]: I0613 05:04:06.288854 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362d8879-aff0-4e42-b7b8-e6c7afedec8e" path="/var/lib/kubelet/pods/362d8879-aff0-4e42-b7b8-e6c7afedec8e/volumes" Jun 13 05:04:06 crc kubenswrapper[4894]: I0613 05:04:06.658192 4894 generic.go:334] "Generic (PLEG): container finished" podID="be04e48a-577c-4d75-9a73-30081f8881ac" containerID="21d537595f98f9262b192099f62b0918125f6b28f6c7a2cd7f8e83eae326bbc8" exitCode=0 Jun 13 05:04:06 crc kubenswrapper[4894]: I0613 05:04:06.659403 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e3d6bb95-a363-4ac9-8034-dff9e9642464","Type":"ContainerStarted","Data":"622ed26fea7925c1ab8bbcaeb2f6642b0bdcb7e7897207b249203546d12cd068"} Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.227192 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xfb9q" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.299074 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host\") pod \"be04e48a-577c-4d75-9a73-30081f8881ac\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.299176 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmndf\" (UniqueName: \"kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf\") pod \"be04e48a-577c-4d75-9a73-30081f8881ac\" (UID: \"be04e48a-577c-4d75-9a73-30081f8881ac\") " Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.299548 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host" (OuterVolumeSpecName: "host") pod "be04e48a-577c-4d75-9a73-30081f8881ac" (UID: "be04e48a-577c-4d75-9a73-30081f8881ac"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.309292 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf" (OuterVolumeSpecName: "kube-api-access-cmndf") pod "be04e48a-577c-4d75-9a73-30081f8881ac" (UID: "be04e48a-577c-4d75-9a73-30081f8881ac"). InnerVolumeSpecName "kube-api-access-cmndf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.400711 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmndf\" (UniqueName: \"kubernetes.io/projected/be04e48a-577c-4d75-9a73-30081f8881ac-kube-api-access-cmndf\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.400737 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be04e48a-577c-4d75-9a73-30081f8881ac-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.668746 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xfb9q" Jun 13 05:04:07 crc kubenswrapper[4894]: I0613 05:04:07.668766 4894 scope.go:117] "RemoveContainer" containerID="21d537595f98f9262b192099f62b0918125f6b28f6c7a2cd7f8e83eae326bbc8" Jun 13 05:04:08 crc kubenswrapper[4894]: I0613 05:04:08.287130 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be04e48a-577c-4d75-9a73-30081f8881ac" path="/var/lib/kubelet/pods/be04e48a-577c-4d75-9a73-30081f8881ac/volumes" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.726844 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5zm5p" event={"ID":"e715a67e-623b-4d05-8bc9-676747d445fb","Type":"ContainerStarted","Data":"334a1f5c95014fc617c59fcba2e1dec7421f70d040616c3823b6d21009c24872"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.727321 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5zm5p" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.728356 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5e584455-5537-425d-a454-063087cc3fea","Type":"ContainerStarted","Data":"252c3adbf15d63c10de9fcd1f6bca9ac5a58dbfc446d8bf531fbd566f20219e6"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.730487 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1bb10c27-9b94-43cd-82df-407e68605449","Type":"ContainerStarted","Data":"58d1832604addf25761ebe87bbf9c6f9d9f412a48fbd47b5d31b360861718683"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.730856 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.732001 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vb6p4" event={"ID":"138962ec-89d1-4771-adad-e9a0d910e80b","Type":"ContainerStarted","Data":"93ef6778f9825a03d3ec8d83a33b98ee494b0f146e432eb554d903a711ff44db"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.734113 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"138daa45-0563-4c44-8b99-9bfb66eea5c6","Type":"ContainerStarted","Data":"2278573eab36df258f1e07334574f06c4c7185582adfffd1344160b380e62adf"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.736276 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e3d6bb95-a363-4ac9-8034-dff9e9642464","Type":"ContainerStarted","Data":"ec4931b344822ab38fb9272783d4f6366f353e43906574bd4f7b44440a55388b"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.739874 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32","Type":"ContainerStarted","Data":"05a1bb0713bbf26b080a886cf891168b90779fe7a83775ec80c0e831574e6041"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.742620 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eaa87fe1-544c-4780-a350-acb43e14d346","Type":"ContainerStarted","Data":"7e03455584cb09199bc2f4a8d1aa180aebc9806ba69ee48f6c8d328382dc8639"} Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.743409 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.749070 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5zm5p" podStartSLOduration=13.465616814 podStartE2EDuration="21.749055521s" podCreationTimestamp="2025-06-13 05:03:51 +0000 UTC" firstStartedPulling="2025-06-13 05:04:03.759252052 +0000 UTC m=+802.205499515" lastFinishedPulling="2025-06-13 05:04:12.042690749 +0000 UTC m=+810.488938222" observedRunningTime="2025-06-13 05:04:12.745999425 +0000 UTC m=+811.192246888" watchObservedRunningTime="2025-06-13 05:04:12.749055521 +0000 UTC m=+811.195302984" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.791773 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.860150745 podStartE2EDuration="23.791754904s" podCreationTimestamp="2025-06-13 05:03:49 +0000 UTC" firstStartedPulling="2025-06-13 05:04:06.112235963 +0000 UTC m=+804.558483426" lastFinishedPulling="2025-06-13 05:04:12.043840121 +0000 UTC m=+810.490087585" observedRunningTime="2025-06-13 05:04:12.76180954 +0000 UTC m=+811.208057003" watchObservedRunningTime="2025-06-13 05:04:12.791754904 +0000 UTC m=+811.238002377" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.811476 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.27159544 podStartE2EDuration="19.811452968s" podCreationTimestamp="2025-06-13 05:03:53 +0000 UTC" firstStartedPulling="2025-06-13 05:04:04.518262036 +0000 UTC m=+802.964509499" lastFinishedPulling="2025-06-13 05:04:12.058119554 +0000 UTC m=+810.504367027" observedRunningTime="2025-06-13 05:04:12.810127801 +0000 UTC m=+811.256375264" watchObservedRunningTime="2025-06-13 05:04:12.811452968 +0000 UTC m=+811.257700431" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.843939 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.401184559 podStartE2EDuration="25.843916663s" podCreationTimestamp="2025-06-13 05:03:47 +0000 UTC" firstStartedPulling="2025-06-13 05:04:03.097812435 +0000 UTC m=+801.544059938" lastFinishedPulling="2025-06-13 05:04:11.540544579 +0000 UTC m=+809.986792042" observedRunningTime="2025-06-13 05:04:12.841724541 +0000 UTC m=+811.287972004" watchObservedRunningTime="2025-06-13 05:04:12.843916663 +0000 UTC m=+811.290164126" Jun 13 05:04:12 crc kubenswrapper[4894]: I0613 05:04:12.878946 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.702059433 podStartE2EDuration="26.878931249s" podCreationTimestamp="2025-06-13 05:03:46 +0000 UTC" firstStartedPulling="2025-06-13 05:04:03.728221488 +0000 UTC m=+802.174468951" lastFinishedPulling="2025-06-13 05:04:11.905093284 +0000 UTC m=+810.351340767" 
observedRunningTime="2025-06-13 05:04:12.872970691 +0000 UTC m=+811.319218154" watchObservedRunningTime="2025-06-13 05:04:12.878931249 +0000 UTC m=+811.325178712" Jun 13 05:04:13 crc kubenswrapper[4894]: I0613 05:04:13.336562 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jun 13 05:04:13 crc kubenswrapper[4894]: I0613 05:04:13.754586 4894 generic.go:334] "Generic (PLEG): container finished" podID="138962ec-89d1-4771-adad-e9a0d910e80b" containerID="93ef6778f9825a03d3ec8d83a33b98ee494b0f146e432eb554d903a711ff44db" exitCode=0 Jun 13 05:04:13 crc kubenswrapper[4894]: I0613 05:04:13.756935 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vb6p4" event={"ID":"138962ec-89d1-4771-adad-e9a0d910e80b","Type":"ContainerDied","Data":"93ef6778f9825a03d3ec8d83a33b98ee494b0f146e432eb554d903a711ff44db"} Jun 13 05:04:14 crc kubenswrapper[4894]: I0613 05:04:14.765134 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vb6p4" event={"ID":"138962ec-89d1-4771-adad-e9a0d910e80b","Type":"ContainerStarted","Data":"d0efe8c3e13e0bebf61797313e0f4e7acdb588704190aa56d430282799d8c76b"} Jun 13 05:04:14 crc kubenswrapper[4894]: I0613 05:04:14.765424 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vb6p4" event={"ID":"138962ec-89d1-4771-adad-e9a0d910e80b","Type":"ContainerStarted","Data":"7459a7df0be3998a1b20799f3933e298a02fb32a0094b6fd7453ce38e21ee36a"} Jun 13 05:04:14 crc kubenswrapper[4894]: I0613 05:04:14.766331 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:04:14 crc kubenswrapper[4894]: I0613 05:04:14.766359 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:04:14 crc kubenswrapper[4894]: I0613 05:04:14.794021 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vb6p4" podStartSLOduration=17.006734324 podStartE2EDuration="23.793996379s" podCreationTimestamp="2025-06-13 05:03:51 +0000 UTC" firstStartedPulling="2025-06-13 05:04:05.233508917 +0000 UTC m=+803.679756380" lastFinishedPulling="2025-06-13 05:04:12.020770932 +0000 UTC m=+810.467018435" observedRunningTime="2025-06-13 05:04:14.79049993 +0000 UTC m=+813.236747403" watchObservedRunningTime="2025-06-13 05:04:14.793996379 +0000 UTC m=+813.240243882" Jun 13 05:04:15 crc kubenswrapper[4894]: I0613 05:04:15.149897 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jun 13 05:04:15 crc kubenswrapper[4894]: I0613 05:04:15.223974 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jun 13 05:04:15 crc kubenswrapper[4894]: I0613 05:04:15.337056 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jun 13 05:04:15 crc kubenswrapper[4894]: I0613 05:04:15.774989 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jun 13 05:04:16 crc kubenswrapper[4894]: I0613 05:04:16.395739 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jun 13 05:04:16 crc kubenswrapper[4894]: I0613 05:04:16.787130 4894 generic.go:334] "Generic (PLEG): container finished" podID="5e584455-5537-425d-a454-063087cc3fea" 
containerID="252c3adbf15d63c10de9fcd1f6bca9ac5a58dbfc446d8bf531fbd566f20219e6" exitCode=0 Jun 13 05:04:16 crc kubenswrapper[4894]: I0613 05:04:16.787213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5e584455-5537-425d-a454-063087cc3fea","Type":"ContainerDied","Data":"252c3adbf15d63c10de9fcd1f6bca9ac5a58dbfc446d8bf531fbd566f20219e6"} Jun 13 05:04:16 crc kubenswrapper[4894]: I0613 05:04:16.790906 4894 generic.go:334] "Generic (PLEG): container finished" podID="138daa45-0563-4c44-8b99-9bfb66eea5c6" containerID="2278573eab36df258f1e07334574f06c4c7185582adfffd1344160b380e62adf" exitCode=0 Jun 13 05:04:16 crc kubenswrapper[4894]: I0613 05:04:16.792055 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"138daa45-0563-4c44-8b99-9bfb66eea5c6","Type":"ContainerDied","Data":"2278573eab36df258f1e07334574f06c4c7185582adfffd1344160b380e62adf"} Jun 13 05:04:17 crc kubenswrapper[4894]: I0613 05:04:17.144216 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jun 13 05:04:17 crc kubenswrapper[4894]: I0613 05:04:17.922786 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jun 13 05:04:18 crc kubenswrapper[4894]: I0613 05:04:18.807461 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5e584455-5537-425d-a454-063087cc3fea","Type":"ContainerStarted","Data":"97a3b79599e92293dca2ade3d93106732930a60becef7895db55890efbdbad10"} Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.818996 4894 generic.go:334] "Generic (PLEG): container finished" podID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerID="e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4" exitCode=0 Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.819096 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" event={"ID":"96f1392b-00f5-4ed6-8f88-0a7a79134e67","Type":"ContainerDied","Data":"e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4"} Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.824244 4894 generic.go:334] "Generic (PLEG): container finished" podID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerID="594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5" exitCode=0 Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.824301 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" event={"ID":"6af3e3f7-f02d-4b2c-8662-435448da1ca5","Type":"ContainerDied","Data":"594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5"} Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.827389 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"138daa45-0563-4c44-8b99-9bfb66eea5c6","Type":"ContainerStarted","Data":"0bafa1319cbd9d76a5aa3550485d5e56f0043c3ca819c348b6b2029c9baae5f0"} Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.914759 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.916020454 podStartE2EDuration="35.914744993s" podCreationTimestamp="2025-06-13 05:03:44 +0000 UTC" firstStartedPulling="2025-06-13 05:04:04.016952879 +0000 UTC m=+802.463200342" lastFinishedPulling="2025-06-13 05:04:12.015677408 +0000 UTC m=+810.461924881" observedRunningTime="2025-06-13 05:04:19.914549518 +0000 
UTC m=+818.360796991" watchObservedRunningTime="2025-06-13 05:04:19.914744993 +0000 UTC m=+818.360992446" Jun 13 05:04:19 crc kubenswrapper[4894]: I0613 05:04:19.947436 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.906240538 podStartE2EDuration="34.947420053s" podCreationTimestamp="2025-06-13 05:03:45 +0000 UTC" firstStartedPulling="2025-06-13 05:04:04.015853308 +0000 UTC m=+802.462100761" lastFinishedPulling="2025-06-13 05:04:12.057032823 +0000 UTC m=+810.503280276" observedRunningTime="2025-06-13 05:04:19.945108578 +0000 UTC m=+818.391356041" watchObservedRunningTime="2025-06-13 05:04:19.947420053 +0000 UTC m=+818.393667516" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.407220 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.693370 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.725731 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:20 crc kubenswrapper[4894]: E0613 05:04:20.725998 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be04e48a-577c-4d75-9a73-30081f8881ac" containerName="container-00" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.726012 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="be04e48a-577c-4d75-9a73-30081f8881ac" containerName="container-00" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.726141 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="be04e48a-577c-4d75-9a73-30081f8881ac" containerName="container-00" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.726894 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.729507 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.771080 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.835060 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" event={"ID":"96f1392b-00f5-4ed6-8f88-0a7a79134e67","Type":"ContainerStarted","Data":"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff"} Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.836515 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" event={"ID":"6af3e3f7-f02d-4b2c-8662-435448da1ca5","Type":"ContainerStarted","Data":"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1"} Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.836714 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.836720 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="dnsmasq-dns" containerID="cri-o://5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1" gracePeriod=10 Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.847890 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmhsj\" (UniqueName: \"kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.847968 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.848029 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.848063 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.854465 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" podStartSLOduration=3.585496012 podStartE2EDuration="38.854450116s" podCreationTimestamp="2025-06-13 05:03:42 +0000 UTC" firstStartedPulling="2025-06-13 05:03:43.627674639 +0000 UTC m=+782.073922102" 
lastFinishedPulling="2025-06-13 05:04:18.896628733 +0000 UTC m=+817.342876206" observedRunningTime="2025-06-13 05:04:20.8492367 +0000 UTC m=+819.295484163" watchObservedRunningTime="2025-06-13 05:04:20.854450116 +0000 UTC m=+819.300697579" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.952564 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.952726 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.952772 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.952802 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmhsj\" (UniqueName: \"kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.953836 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.954926 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.955534 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.979588 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmhsj\" (UniqueName: \"kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj\") pod \"dnsmasq-dns-6479755b69-k46lw\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.989724 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" podStartSLOduration=3.351948486 podStartE2EDuration="38.989705905s" 
podCreationTimestamp="2025-06-13 05:03:42 +0000 UTC" firstStartedPulling="2025-06-13 05:03:43.486351119 +0000 UTC m=+781.932598572" lastFinishedPulling="2025-06-13 05:04:19.124108528 +0000 UTC m=+817.570355991" observedRunningTime="2025-06-13 05:04:20.875649963 +0000 UTC m=+819.321897426" watchObservedRunningTime="2025-06-13 05:04:20.989705905 +0000 UTC m=+819.435953368" Jun 13 05:04:20 crc kubenswrapper[4894]: I0613 05:04:20.994444 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.031807 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.032928 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.035510 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.052053 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.057861 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.159555 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx7t5\" (UniqueName: \"kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.159683 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.159707 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.159726 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.159762 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.203476 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-0" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.261467 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx7t5\" (UniqueName: \"kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.261552 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.261576 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.261593 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.261630 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.262456 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.263241 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.263701 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.273189 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.282592 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wx7t5\" (UniqueName: \"kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5\") pod \"dnsmasq-dns-644b7d5b7f-qbgzj\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.345301 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.349243 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.393564 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-774dc9f9f-dcqbb"] Jun 13 05:04:21 crc kubenswrapper[4894]: E0613 05:04:21.393927 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="init" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.393943 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="init" Jun 13 05:04:21 crc kubenswrapper[4894]: E0613 05:04:21.393958 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="dnsmasq-dns" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.393964 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="dnsmasq-dns" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.394124 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerName="dnsmasq-dns" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.395343 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.398415 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vmkwl" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.398695 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.422133 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-774dc9f9f-dcqbb"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.472606 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnplk\" (UniqueName: \"kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk\") pod \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.472700 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc\") pod \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.472823 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config\") pod \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\" (UID: \"6af3e3f7-f02d-4b2c-8662-435448da1ca5\") " Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.473027 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-ovn-northd-tls-certs\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.473047 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gzd\" (UniqueName: \"kubernetes.io/projected/71f90c09-b2cc-4e1c-a18f-595a5efeb141-kube-api-access-w5gzd\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.473089 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-combined-ca-bundle\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.481178 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk" (OuterVolumeSpecName: "kube-api-access-rnplk") pod "6af3e3f7-f02d-4b2c-8662-435448da1ca5" (UID: "6af3e3f7-f02d-4b2c-8662-435448da1ca5"). InnerVolumeSpecName "kube-api-access-rnplk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.530208 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6af3e3f7-f02d-4b2c-8662-435448da1ca5" (UID: "6af3e3f7-f02d-4b2c-8662-435448da1ca5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.530684 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config" (OuterVolumeSpecName: "config") pod "6af3e3f7-f02d-4b2c-8662-435448da1ca5" (UID: "6af3e3f7-f02d-4b2c-8662-435448da1ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575586 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-ovn-northd-tls-certs\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575636 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gzd\" (UniqueName: \"kubernetes.io/projected/71f90c09-b2cc-4e1c-a18f-595a5efeb141-kube-api-access-w5gzd\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575695 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-combined-ca-bundle\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575806 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnplk\" (UniqueName: \"kubernetes.io/projected/6af3e3f7-f02d-4b2c-8662-435448da1ca5-kube-api-access-rnplk\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575819 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.575828 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af3e3f7-f02d-4b2c-8662-435448da1ca5-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.579233 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-combined-ca-bundle\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.579622 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71f90c09-b2cc-4e1c-a18f-595a5efeb141-ovn-northd-tls-certs\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: 
\"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.594268 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gzd\" (UniqueName: \"kubernetes.io/projected/71f90c09-b2cc-4e1c-a18f-595a5efeb141-kube-api-access-w5gzd\") pod \"ovn-northd-774dc9f9f-dcqbb\" (UID: \"71f90c09-b2cc-4e1c-a18f-595a5efeb141\") " pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.700493 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:21 crc kubenswrapper[4894]: W0613 05:04:21.705195 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9fde432_6224_4aef_aa5b_157d7a4fb4e5.slice/crio-2a64c2a549bd1b063ef782a005a0cebdcf6e1f8be949eb4a670f03a624cf815d WatchSource:0}: Error finding container 2a64c2a549bd1b063ef782a005a0cebdcf6e1f8be949eb4a670f03a624cf815d: Status 404 returned error can't find the container with id 2a64c2a549bd1b063ef782a005a0cebdcf6e1f8be949eb4a670f03a624cf815d Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.718557 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.813292 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.844707 4894 generic.go:334] "Generic (PLEG): container finished" podID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" containerID="5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1" exitCode=0 Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.844879 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" event={"ID":"6af3e3f7-f02d-4b2c-8662-435448da1ca5","Type":"ContainerDied","Data":"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1"} Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.845021 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" event={"ID":"6af3e3f7-f02d-4b2c-8662-435448da1ca5","Type":"ContainerDied","Data":"65131294e9edcb0a7fc260e0eafac23b35ccd034ab90bb27e90332c0aa58d6b3"} Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.845040 4894 scope.go:117] "RemoveContainer" containerID="5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.845376 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5544c68b5-lbkpx" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.857782 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6479755b69-k46lw" event={"ID":"a9fde432-6224-4aef-aa5b-157d7a4fb4e5","Type":"ContainerStarted","Data":"2a64c2a549bd1b063ef782a005a0cebdcf6e1f8be949eb4a670f03a624cf815d"} Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.859475 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="dnsmasq-dns" containerID="cri-o://002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff" gracePeriod=10 Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.859703 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" event={"ID":"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4","Type":"ContainerStarted","Data":"c3394be169d66dd9b37e8d8bc11c5f5e68a155048ce73dd1fbee5f861d00aab8"} Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.859729 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.893072 4894 scope.go:117] "RemoveContainer" containerID="594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5" Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.896386 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:04:21 crc kubenswrapper[4894]: I0613 05:04:21.901572 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5544c68b5-lbkpx"] Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.057214 4894 scope.go:117] "RemoveContainer" containerID="5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1" Jun 13 05:04:22 crc kubenswrapper[4894]: E0613 05:04:22.058194 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1\": container with ID starting with 5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1 not found: ID does not exist" containerID="5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.058223 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1"} err="failed to get container status \"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1\": rpc error: code = NotFound desc = could not find container \"5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1\": container with ID starting with 5000ab4dfff88de5209d36049d76c18b97de761f18cb5b664c3c7284a8e67bf1 not found: ID does not exist" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.058246 4894 scope.go:117] "RemoveContainer" containerID="594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5" Jun 13 05:04:22 crc kubenswrapper[4894]: E0613 05:04:22.058561 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5\": container with ID starting with 594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5 not found: ID does not exist" 
containerID="594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.058584 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5"} err="failed to get container status \"594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5\": rpc error: code = NotFound desc = could not find container \"594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5\": container with ID starting with 594c043aafefb19f7d6887a4fee509ee0f72387cd44b33b91cd2f4d92a88eaa5 not found: ID does not exist" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.150465 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-774dc9f9f-dcqbb"] Jun 13 05:04:22 crc kubenswrapper[4894]: W0613 05:04:22.156672 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71f90c09_b2cc_4e1c_a18f_595a5efeb141.slice/crio-3be086b6da9e23336c67f13c1f43372505c932e7ba67b7f1b50289fe0d27f375 WatchSource:0}: Error finding container 3be086b6da9e23336c67f13c1f43372505c932e7ba67b7f1b50289fe0d27f375: Status 404 returned error can't find the container with id 3be086b6da9e23336c67f13c1f43372505c932e7ba67b7f1b50289fe0d27f375 Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.260416 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.286260 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af3e3f7-f02d-4b2c-8662-435448da1ca5" path="/var/lib/kubelet/pods/6af3e3f7-f02d-4b2c-8662-435448da1ca5/volumes" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.387900 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4drbw\" (UniqueName: \"kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw\") pod \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.388167 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config\") pod \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.388356 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc\") pod \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\" (UID: \"96f1392b-00f5-4ed6-8f88-0a7a79134e67\") " Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.416376 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw" (OuterVolumeSpecName: "kube-api-access-4drbw") pod "96f1392b-00f5-4ed6-8f88-0a7a79134e67" (UID: "96f1392b-00f5-4ed6-8f88-0a7a79134e67"). InnerVolumeSpecName "kube-api-access-4drbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.434877 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96f1392b-00f5-4ed6-8f88-0a7a79134e67" (UID: "96f1392b-00f5-4ed6-8f88-0a7a79134e67"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.447666 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config" (OuterVolumeSpecName: "config") pod "96f1392b-00f5-4ed6-8f88-0a7a79134e67" (UID: "96f1392b-00f5-4ed6-8f88-0a7a79134e67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.492615 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.492692 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4drbw\" (UniqueName: \"kubernetes.io/projected/96f1392b-00f5-4ed6-8f88-0a7a79134e67-kube-api-access-4drbw\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.492715 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f1392b-00f5-4ed6-8f88-0a7a79134e67-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.867607 4894 generic.go:334] "Generic (PLEG): container finished" podID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerID="002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff" exitCode=0 Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.867730 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" event={"ID":"96f1392b-00f5-4ed6-8f88-0a7a79134e67","Type":"ContainerDied","Data":"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff"} Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.867923 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" event={"ID":"96f1392b-00f5-4ed6-8f88-0a7a79134e67","Type":"ContainerDied","Data":"012887b33bdfdedb0552038bc3b30eeb03ff965a1d441dc5cd24ba991ef46ffb"} Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.867740 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5db84f4f-r2x2r" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.867944 4894 scope.go:117] "RemoveContainer" containerID="002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.871457 4894 generic.go:334] "Generic (PLEG): container finished" podID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerID="000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65" exitCode=0 Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.871532 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6479755b69-k46lw" event={"ID":"a9fde432-6224-4aef-aa5b-157d7a4fb4e5","Type":"ContainerDied","Data":"000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65"} Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.875788 4894 generic.go:334] "Generic (PLEG): container finished" podID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerID="18d2a89e3e926ba407104219d31d641cfde7691e6293b2657d4398ec1ea41d09" exitCode=0 Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.875857 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" event={"ID":"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4","Type":"ContainerDied","Data":"18d2a89e3e926ba407104219d31d641cfde7691e6293b2657d4398ec1ea41d09"} Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.876821 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-774dc9f9f-dcqbb" event={"ID":"71f90c09-b2cc-4e1c-a18f-595a5efeb141","Type":"ContainerStarted","Data":"3be086b6da9e23336c67f13c1f43372505c932e7ba67b7f1b50289fe0d27f375"} Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.896746 4894 scope.go:117] "RemoveContainer" containerID="e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.932045 4894 scope.go:117] "RemoveContainer" containerID="002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff" Jun 13 05:04:22 crc kubenswrapper[4894]: E0613 05:04:22.933084 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff\": container with ID starting with 002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff not found: ID does not exist" containerID="002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.933123 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff"} err="failed to get container status \"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff\": rpc error: code = NotFound desc = could not find container \"002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff\": container with ID starting with 002f9033c50ab0981fdaa5a03ebf595b4ebe45433e648739a70984bd4bc589ff not found: ID does not exist" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.933148 4894 scope.go:117] "RemoveContainer" containerID="e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4" Jun 13 05:04:22 crc kubenswrapper[4894]: E0613 05:04:22.936618 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4\": 
container with ID starting with e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4 not found: ID does not exist" containerID="e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.936661 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4"} err="failed to get container status \"e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4\": rpc error: code = NotFound desc = could not find container \"e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4\": container with ID starting with e6704fc79bb51284e48982bffc9020979618597a967174a4dc6669a1fbbeaad4 not found: ID does not exist" Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.939158 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:04:22 crc kubenswrapper[4894]: I0613 05:04:22.946688 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5db84f4f-r2x2r"] Jun 13 05:04:22 crc kubenswrapper[4894]: E0613 05:04:22.986630 4894 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:59650->38.102.83.213:40951: write tcp 38.102.83.213:59650->38.102.83.213:40951: write: broken pipe Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.886751 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6479755b69-k46lw" event={"ID":"a9fde432-6224-4aef-aa5b-157d7a4fb4e5","Type":"ContainerStarted","Data":"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e"} Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.887096 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.888641 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" event={"ID":"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4","Type":"ContainerStarted","Data":"bcfe7bf751fbc4cc225a82fc48252796d7312c544975658459b9964aecf1b2a2"} Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.890080 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-774dc9f9f-dcqbb" event={"ID":"71f90c09-b2cc-4e1c-a18f-595a5efeb141","Type":"ContainerStarted","Data":"40ae430f175d5ca4ccee93b5f9143d04cec122e71c70b230a5e5032340c3f2e3"} Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.890236 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.919325 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6479755b69-k46lw" podStartSLOduration=3.919308185 podStartE2EDuration="3.919308185s" podCreationTimestamp="2025-06-13 05:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:04:23.911263108 +0000 UTC m=+822.357510571" watchObservedRunningTime="2025-06-13 05:04:23.919308185 +0000 UTC m=+822.365555658" Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.933305 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" podStartSLOduration=2.933288279 podStartE2EDuration="2.933288279s" podCreationTimestamp="2025-06-13 05:04:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:04:23.928227706 +0000 UTC m=+822.374475169" watchObservedRunningTime="2025-06-13 05:04:23.933288279 +0000 UTC m=+822.379535742" Jun 13 05:04:23 crc kubenswrapper[4894]: I0613 05:04:23.950907 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-774dc9f9f-dcqbb" podStartSLOduration=1.785312882 podStartE2EDuration="2.950889484s" podCreationTimestamp="2025-06-13 05:04:21 +0000 UTC" firstStartedPulling="2025-06-13 05:04:22.158952573 +0000 UTC m=+820.605200036" lastFinishedPulling="2025-06-13 05:04:23.324529175 +0000 UTC m=+821.770776638" observedRunningTime="2025-06-13 05:04:23.946734937 +0000 UTC m=+822.392982440" watchObservedRunningTime="2025-06-13 05:04:23.950889484 +0000 UTC m=+822.397136947" Jun 13 05:04:24 crc kubenswrapper[4894]: I0613 05:04:24.286218 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" path="/var/lib/kubelet/pods/96f1392b-00f5-4ed6-8f88-0a7a79134e67/volumes" Jun 13 05:04:24 crc kubenswrapper[4894]: I0613 05:04:24.921642 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.101675 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.102200 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.162244 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.540839 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.540901 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.611737 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jun 13 05:04:26 crc kubenswrapper[4894]: I0613 05:04:26.992436 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.005900 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.438419 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wnhfr"] Jun 13 05:04:27 crc kubenswrapper[4894]: E0613 05:04:27.438696 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="init" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.438708 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="init" Jun 13 05:04:27 crc kubenswrapper[4894]: E0613 05:04:27.438745 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="dnsmasq-dns" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.438751 4894 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="dnsmasq-dns" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.438888 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f1392b-00f5-4ed6-8f88-0a7a79134e67" containerName="dnsmasq-dns" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.439362 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.448727 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wnhfr"] Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.574335 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52m9\" (UniqueName: \"kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9\") pod \"glance-db-create-wnhfr\" (UID: \"7575b610-4439-4ce2-bcc0-1d52e1ab719f\") " pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.676250 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52m9\" (UniqueName: \"kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9\") pod \"glance-db-create-wnhfr\" (UID: \"7575b610-4439-4ce2-bcc0-1d52e1ab719f\") " pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.699519 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52m9\" (UniqueName: \"kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9\") pod \"glance-db-create-wnhfr\" (UID: \"7575b610-4439-4ce2-bcc0-1d52e1ab719f\") " pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:27 crc kubenswrapper[4894]: I0613 05:04:27.755248 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:28 crc kubenswrapper[4894]: I0613 05:04:28.203526 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wnhfr"] Jun 13 05:04:28 crc kubenswrapper[4894]: I0613 05:04:28.968285 4894 generic.go:334] "Generic (PLEG): container finished" podID="7575b610-4439-4ce2-bcc0-1d52e1ab719f" containerID="fca5074640aab5c4d09b03510b18cb4e62c9d0f174cd847b9600ce890d87eaaa" exitCode=0 Jun 13 05:04:28 crc kubenswrapper[4894]: I0613 05:04:28.968510 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wnhfr" event={"ID":"7575b610-4439-4ce2-bcc0-1d52e1ab719f","Type":"ContainerDied","Data":"fca5074640aab5c4d09b03510b18cb4e62c9d0f174cd847b9600ce890d87eaaa"} Jun 13 05:04:28 crc kubenswrapper[4894]: I0613 05:04:28.968536 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wnhfr" event={"ID":"7575b610-4439-4ce2-bcc0-1d52e1ab719f","Type":"ContainerStarted","Data":"ae7634f90107005b1bb47313615bc5cffaa101018fc20b92cc611bb763d0de6a"} Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.352763 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.417175 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52m9\" (UniqueName: \"kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9\") pod \"7575b610-4439-4ce2-bcc0-1d52e1ab719f\" (UID: \"7575b610-4439-4ce2-bcc0-1d52e1ab719f\") " Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.421736 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9" (OuterVolumeSpecName: "kube-api-access-s52m9") pod "7575b610-4439-4ce2-bcc0-1d52e1ab719f" (UID: "7575b610-4439-4ce2-bcc0-1d52e1ab719f"). InnerVolumeSpecName "kube-api-access-s52m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.519287 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52m9\" (UniqueName: \"kubernetes.io/projected/7575b610-4439-4ce2-bcc0-1d52e1ab719f-kube-api-access-s52m9\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.984032 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wnhfr" event={"ID":"7575b610-4439-4ce2-bcc0-1d52e1ab719f","Type":"ContainerDied","Data":"ae7634f90107005b1bb47313615bc5cffaa101018fc20b92cc611bb763d0de6a"} Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.984068 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7634f90107005b1bb47313615bc5cffaa101018fc20b92cc611bb763d0de6a" Jun 13 05:04:30 crc kubenswrapper[4894]: I0613 05:04:30.984076 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wnhfr" Jun 13 05:04:31 crc kubenswrapper[4894]: I0613 05:04:31.053896 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:31 crc kubenswrapper[4894]: I0613 05:04:31.350963 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:04:31 crc kubenswrapper[4894]: I0613 05:04:31.422668 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:31 crc kubenswrapper[4894]: I0613 05:04:31.757767 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-774dc9f9f-dcqbb" Jun 13 05:04:31 crc kubenswrapper[4894]: I0613 05:04:31.990394 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6479755b69-k46lw" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="dnsmasq-dns" containerID="cri-o://ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e" gracePeriod=10 Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.487672 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.552183 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmhsj\" (UniqueName: \"kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj\") pod \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.552242 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb\") pod \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.552270 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc\") pod \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.552328 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config\") pod \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\" (UID: \"a9fde432-6224-4aef-aa5b-157d7a4fb4e5\") " Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.568280 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj" (OuterVolumeSpecName: "kube-api-access-fmhsj") pod "a9fde432-6224-4aef-aa5b-157d7a4fb4e5" (UID: "a9fde432-6224-4aef-aa5b-157d7a4fb4e5"). InnerVolumeSpecName "kube-api-access-fmhsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.590301 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9fde432-6224-4aef-aa5b-157d7a4fb4e5" (UID: "a9fde432-6224-4aef-aa5b-157d7a4fb4e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.595425 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9fde432-6224-4aef-aa5b-157d7a4fb4e5" (UID: "a9fde432-6224-4aef-aa5b-157d7a4fb4e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.595579 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config" (OuterVolumeSpecName: "config") pod "a9fde432-6224-4aef-aa5b-157d7a4fb4e5" (UID: "a9fde432-6224-4aef-aa5b-157d7a4fb4e5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.658172 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmhsj\" (UniqueName: \"kubernetes.io/projected/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-kube-api-access-fmhsj\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.658203 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.658213 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:32 crc kubenswrapper[4894]: I0613 05:04:32.658223 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fde432-6224-4aef-aa5b-157d7a4fb4e5-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:32.999975 4894 generic.go:334] "Generic (PLEG): container finished" podID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerID="ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e" exitCode=0 Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.000015 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6479755b69-k46lw" event={"ID":"a9fde432-6224-4aef-aa5b-157d7a4fb4e5","Type":"ContainerDied","Data":"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e"} Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.000041 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6479755b69-k46lw" event={"ID":"a9fde432-6224-4aef-aa5b-157d7a4fb4e5","Type":"ContainerDied","Data":"2a64c2a549bd1b063ef782a005a0cebdcf6e1f8be949eb4a670f03a624cf815d"} Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.000058 4894 scope.go:117] "RemoveContainer" containerID="ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.000168 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6479755b69-k46lw" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.040534 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.052767 4894 scope.go:117] "RemoveContainer" containerID="000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.062399 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6479755b69-k46lw"] Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.091451 4894 scope.go:117] "RemoveContainer" containerID="ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e" Jun 13 05:04:33 crc kubenswrapper[4894]: E0613 05:04:33.091931 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e\": container with ID starting with ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e not found: ID does not exist" containerID="ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.091961 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e"} err="failed to get container status \"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e\": rpc error: code = NotFound desc = could not find container \"ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e\": container with ID starting with ff038be6696c2cfd1e40df61d97cdeda06a8747bb2e988b43e579fa7dce6f98e not found: ID does not exist" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.091985 4894 scope.go:117] "RemoveContainer" containerID="000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65" Jun 13 05:04:33 crc kubenswrapper[4894]: E0613 05:04:33.092467 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65\": container with ID starting with 000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65 not found: ID does not exist" containerID="000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65" Jun 13 05:04:33 crc kubenswrapper[4894]: I0613 05:04:33.092488 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65"} err="failed to get container status \"000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65\": rpc error: code = NotFound desc = could not find container \"000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65\": container with ID starting with 000ef1eb36024aa219656680b593031818509349b93d1f53e27d246147116d65 not found: ID does not exist" Jun 13 05:04:34 crc kubenswrapper[4894]: I0613 05:04:34.013644 4894 generic.go:334] "Generic (PLEG): container finished" podID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerID="039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae" exitCode=0 Jun 13 05:04:34 crc kubenswrapper[4894]: I0613 05:04:34.013714 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerDied","Data":"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae"} Jun 13 05:04:34 crc kubenswrapper[4894]: I0613 05:04:34.019759 4894 generic.go:334] "Generic (PLEG): container finished" podID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerID="597c737a6ceae50c71e16da94b68f9fae3b11b7790de42f254bc4ad65fbd9c53" exitCode=0 Jun 13 05:04:34 crc kubenswrapper[4894]: I0613 05:04:34.019808 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerDied","Data":"597c737a6ceae50c71e16da94b68f9fae3b11b7790de42f254bc4ad65fbd9c53"} Jun 13 05:04:34 crc kubenswrapper[4894]: I0613 05:04:34.288730 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" path="/var/lib/kubelet/pods/a9fde432-6224-4aef-aa5b-157d7a4fb4e5/volumes" Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.031513 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerStarted","Data":"1777129e8e8a33beb690e0c487643173eccbbdbb009a127f68facceba9c56f78"} Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.032038 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.034835 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerStarted","Data":"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58"} Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.035028 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.060361 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.165800928 podStartE2EDuration="53.060342485s" podCreationTimestamp="2025-06-13 05:03:42 +0000 UTC" firstStartedPulling="2025-06-13 05:03:44.377047532 +0000 UTC m=+782.823294995" lastFinishedPulling="2025-06-13 05:04:03.271589089 +0000 UTC m=+801.717836552" observedRunningTime="2025-06-13 05:04:35.055690544 +0000 UTC m=+833.501938007" watchObservedRunningTime="2025-06-13 05:04:35.060342485 +0000 UTC m=+833.506589968" Jun 13 05:04:35 crc kubenswrapper[4894]: I0613 05:04:35.107418 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.601834543 podStartE2EDuration="53.107398561s" podCreationTimestamp="2025-06-13 05:03:42 +0000 UTC" firstStartedPulling="2025-06-13 05:03:51.752033507 +0000 UTC m=+790.198280970" lastFinishedPulling="2025-06-13 05:04:03.257597525 +0000 UTC m=+801.703844988" observedRunningTime="2025-06-13 05:04:35.09601251 +0000 UTC m=+833.542259973" watchObservedRunningTime="2025-06-13 05:04:35.107398561 +0000 UTC m=+833.553646024" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.697798 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2trmz"] Jun 13 05:04:36 crc kubenswrapper[4894]: E0613 05:04:36.698141 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7575b610-4439-4ce2-bcc0-1d52e1ab719f" containerName="mariadb-database-create" Jun 13 05:04:36 crc kubenswrapper[4894]: 
I0613 05:04:36.698398 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7575b610-4439-4ce2-bcc0-1d52e1ab719f" containerName="mariadb-database-create" Jun 13 05:04:36 crc kubenswrapper[4894]: E0613 05:04:36.698421 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="dnsmasq-dns" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.698430 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="dnsmasq-dns" Jun 13 05:04:36 crc kubenswrapper[4894]: E0613 05:04:36.698449 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="init" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.698458 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="init" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.698673 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fde432-6224-4aef-aa5b-157d7a4fb4e5" containerName="dnsmasq-dns" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.698698 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7575b610-4439-4ce2-bcc0-1d52e1ab719f" containerName="mariadb-database-create" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.699260 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.715366 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2trmz"] Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.820186 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlprr\" (UniqueName: \"kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr\") pod \"keystone-db-create-2trmz\" (UID: \"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4\") " pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.921255 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlprr\" (UniqueName: \"kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr\") pod \"keystone-db-create-2trmz\" (UID: \"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4\") " pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:36 crc kubenswrapper[4894]: I0613 05:04:36.955795 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlprr\" (UniqueName: \"kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr\") pod \"keystone-db-create-2trmz\" (UID: \"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4\") " pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.013733 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.074774 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wtp2n"] Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.075636 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.085823 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wtp2n"] Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.225409 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdfzz\" (UniqueName: \"kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz\") pod \"placement-db-create-wtp2n\" (UID: \"1db94294-9551-4d80-8c50-5ac61b3343bf\") " pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.326514 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdfzz\" (UniqueName: \"kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz\") pod \"placement-db-create-wtp2n\" (UID: \"1db94294-9551-4d80-8c50-5ac61b3343bf\") " pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.394466 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdfzz\" (UniqueName: \"kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz\") pod \"placement-db-create-wtp2n\" (UID: \"1db94294-9551-4d80-8c50-5ac61b3343bf\") " pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.432004 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.455313 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0696-account-create-28h8v"] Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.456385 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.461194 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.473562 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0696-account-create-28h8v"] Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.505122 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2trmz"] Jun 13 05:04:37 crc kubenswrapper[4894]: W0613 05:04:37.530525 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod565bfe30_bbf9_4eb4_b308_c2bcff8ecbf4.slice/crio-af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86 WatchSource:0}: Error finding container af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86: Status 404 returned error can't find the container with id af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86 Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.633719 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76g2x\" (UniqueName: \"kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x\") pod \"glance-0696-account-create-28h8v\" (UID: \"ee825978-f532-48e8-aeca-7f6a21fd1625\") " pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.735841 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76g2x\" (UniqueName: \"kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x\") pod \"glance-0696-account-create-28h8v\" (UID: \"ee825978-f532-48e8-aeca-7f6a21fd1625\") " pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.752152 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wtp2n"] Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.757542 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76g2x\" (UniqueName: \"kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x\") pod \"glance-0696-account-create-28h8v\" (UID: \"ee825978-f532-48e8-aeca-7f6a21fd1625\") " pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:37 crc kubenswrapper[4894]: I0613 05:04:37.817504 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.052987 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0696-account-create-28h8v"] Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.071135 4894 generic.go:334] "Generic (PLEG): container finished" podID="1db94294-9551-4d80-8c50-5ac61b3343bf" containerID="902880520ce546f18d9dc06e3f3752e078fb2863a89a1462ee1ca5549ca5d14a" exitCode=0 Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.071215 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wtp2n" event={"ID":"1db94294-9551-4d80-8c50-5ac61b3343bf","Type":"ContainerDied","Data":"902880520ce546f18d9dc06e3f3752e078fb2863a89a1462ee1ca5549ca5d14a"} Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.071263 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wtp2n" event={"ID":"1db94294-9551-4d80-8c50-5ac61b3343bf","Type":"ContainerStarted","Data":"baab1d9032ffcb20bd21ea3eae58b41b79405b7da8cc4d84f5f07d6372d1e2cf"} Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.075020 4894 generic.go:334] "Generic (PLEG): container finished" podID="565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" containerID="d0166d765c558ebdb03a2f6737fcb71014fddf7d2b733b9534f80e857a0b9130" exitCode=0 Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.075065 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2trmz" event={"ID":"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4","Type":"ContainerDied","Data":"d0166d765c558ebdb03a2f6737fcb71014fddf7d2b733b9534f80e857a0b9130"} Jun 13 05:04:38 crc kubenswrapper[4894]: I0613 05:04:38.075095 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2trmz" event={"ID":"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4","Type":"ContainerStarted","Data":"af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86"} Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.088028 4894 generic.go:334] "Generic (PLEG): container finished" podID="ee825978-f532-48e8-aeca-7f6a21fd1625" containerID="e0a1e8d9d2c43aedc8bee2c1ec1db47dec62fe0d6c3acc4d1373ed8fa1991371" exitCode=0 Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.088140 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0696-account-create-28h8v" event={"ID":"ee825978-f532-48e8-aeca-7f6a21fd1625","Type":"ContainerDied","Data":"e0a1e8d9d2c43aedc8bee2c1ec1db47dec62fe0d6c3acc4d1373ed8fa1991371"} Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.088563 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0696-account-create-28h8v" event={"ID":"ee825978-f532-48e8-aeca-7f6a21fd1625","Type":"ContainerStarted","Data":"928f0d21e8ce2ac181bae2d3c02efc9ecd0fca55dfb593706050722fdb1fcaef"} Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.530319 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.535982 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.665719 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdfzz\" (UniqueName: \"kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz\") pod \"1db94294-9551-4d80-8c50-5ac61b3343bf\" (UID: \"1db94294-9551-4d80-8c50-5ac61b3343bf\") " Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.665907 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlprr\" (UniqueName: \"kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr\") pod \"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4\" (UID: \"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4\") " Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.672392 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz" (OuterVolumeSpecName: "kube-api-access-jdfzz") pod "1db94294-9551-4d80-8c50-5ac61b3343bf" (UID: "1db94294-9551-4d80-8c50-5ac61b3343bf"). InnerVolumeSpecName "kube-api-access-jdfzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.678788 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr" (OuterVolumeSpecName: "kube-api-access-hlprr") pod "565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" (UID: "565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4"). InnerVolumeSpecName "kube-api-access-hlprr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.767556 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlprr\" (UniqueName: \"kubernetes.io/projected/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4-kube-api-access-hlprr\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:39 crc kubenswrapper[4894]: I0613 05:04:39.767598 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdfzz\" (UniqueName: \"kubernetes.io/projected/1db94294-9551-4d80-8c50-5ac61b3343bf-kube-api-access-jdfzz\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.096454 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wtp2n" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.096467 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wtp2n" event={"ID":"1db94294-9551-4d80-8c50-5ac61b3343bf","Type":"ContainerDied","Data":"baab1d9032ffcb20bd21ea3eae58b41b79405b7da8cc4d84f5f07d6372d1e2cf"} Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.096522 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baab1d9032ffcb20bd21ea3eae58b41b79405b7da8cc4d84f5f07d6372d1e2cf" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.097974 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2trmz" event={"ID":"565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4","Type":"ContainerDied","Data":"af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86"} Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.098081 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af63eca80206e85260b0d80855330226e007729d2f75125d4bea4a02e44a7a86" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.098158 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2trmz" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.434770 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.592343 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76g2x\" (UniqueName: \"kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x\") pod \"ee825978-f532-48e8-aeca-7f6a21fd1625\" (UID: \"ee825978-f532-48e8-aeca-7f6a21fd1625\") " Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.612799 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x" (OuterVolumeSpecName: "kube-api-access-76g2x") pod "ee825978-f532-48e8-aeca-7f6a21fd1625" (UID: "ee825978-f532-48e8-aeca-7f6a21fd1625"). InnerVolumeSpecName "kube-api-access-76g2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:40 crc kubenswrapper[4894]: I0613 05:04:40.694706 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76g2x\" (UniqueName: \"kubernetes.io/projected/ee825978-f532-48e8-aeca-7f6a21fd1625-kube-api-access-76g2x\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:41 crc kubenswrapper[4894]: I0613 05:04:41.112238 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0696-account-create-28h8v" event={"ID":"ee825978-f532-48e8-aeca-7f6a21fd1625","Type":"ContainerDied","Data":"928f0d21e8ce2ac181bae2d3c02efc9ecd0fca55dfb593706050722fdb1fcaef"} Jun 13 05:04:41 crc kubenswrapper[4894]: I0613 05:04:41.112283 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="928f0d21e8ce2ac181bae2d3c02efc9ecd0fca55dfb593706050722fdb1fcaef" Jun 13 05:04:41 crc kubenswrapper[4894]: I0613 05:04:41.112345 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0696-account-create-28h8v" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.199286 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5zm5p" podUID="e715a67e-623b-4d05-8bc9-676747d445fb" containerName="ovn-controller" probeResult="failure" output=< Jun 13 05:04:42 crc kubenswrapper[4894]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jun 13 05:04:42 crc kubenswrapper[4894]: > Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640101 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-t6wgt"] Jun 13 05:04:42 crc kubenswrapper[4894]: E0613 05:04:42.640369 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee825978-f532-48e8-aeca-7f6a21fd1625" containerName="mariadb-account-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640381 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee825978-f532-48e8-aeca-7f6a21fd1625" containerName="mariadb-account-create" Jun 13 05:04:42 crc kubenswrapper[4894]: E0613 05:04:42.640403 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db94294-9551-4d80-8c50-5ac61b3343bf" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640409 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db94294-9551-4d80-8c50-5ac61b3343bf" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: E0613 05:04:42.640430 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640436 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640593 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db94294-9551-4d80-8c50-5ac61b3343bf" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640602 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee825978-f532-48e8-aeca-7f6a21fd1625" containerName="mariadb-account-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.640614 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" containerName="mariadb-database-create" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.641133 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.647879 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-snrch" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.648070 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.660383 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t6wgt"] Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.825412 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.825455 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.825492 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.825521 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkqj\" (UniqueName: \"kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.927451 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.927790 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.927872 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.928392 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkqj\" (UniqueName: \"kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj\") pod 
\"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.932565 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.932596 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.933319 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:42 crc kubenswrapper[4894]: I0613 05:04:42.944615 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkqj\" (UniqueName: \"kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj\") pod \"glance-db-sync-t6wgt\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:43 crc kubenswrapper[4894]: I0613 05:04:43.005346 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t6wgt" Jun 13 05:04:43 crc kubenswrapper[4894]: I0613 05:04:43.609474 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t6wgt"] Jun 13 05:04:44 crc kubenswrapper[4894]: I0613 05:04:44.152077 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6wgt" event={"ID":"9908123d-bc70-4017-953a-8f0a082f2726","Type":"ContainerStarted","Data":"c82e072a6fdd6fd93e265cdc172674367c3cd6775720f01a9632da0cee93605c"} Jun 13 05:04:44 crc kubenswrapper[4894]: I0613 05:04:44.327528 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:04:46 crc kubenswrapper[4894]: I0613 05:04:46.853556 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c1f1-account-create-xrwwd"] Jun 13 05:04:46 crc kubenswrapper[4894]: I0613 05:04:46.855195 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:46 crc kubenswrapper[4894]: I0613 05:04:46.858169 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jun 13 05:04:46 crc kubenswrapper[4894]: I0613 05:04:46.861830 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c1f1-account-create-xrwwd"] Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.019353 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhn5d\" (UniqueName: \"kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d\") pod \"keystone-c1f1-account-create-xrwwd\" (UID: \"5aba867f-c501-4477-bbc6-9d713f2d2b13\") " pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.122131 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhn5d\" (UniqueName: \"kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d\") pod \"keystone-c1f1-account-create-xrwwd\" (UID: \"5aba867f-c501-4477-bbc6-9d713f2d2b13\") " pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.142998 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d6e4-account-create-9tvc8"] Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.148128 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.150757 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.153032 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6e4-account-create-9tvc8"] Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.171719 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhn5d\" (UniqueName: \"kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d\") pod \"keystone-c1f1-account-create-xrwwd\" (UID: \"5aba867f-c501-4477-bbc6-9d713f2d2b13\") " pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.182231 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.200818 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5zm5p" podUID="e715a67e-623b-4d05-8bc9-676747d445fb" containerName="ovn-controller" probeResult="failure" output=< Jun 13 05:04:47 crc kubenswrapper[4894]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jun 13 05:04:47 crc kubenswrapper[4894]: > Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.265681 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.282439 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vb6p4" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.328691 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpzr\" (UniqueName: \"kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr\") pod \"placement-d6e4-account-create-9tvc8\" (UID: \"cb05c85b-3440-4c78-b64a-9e950da85ed9\") " pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.430892 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpzr\" (UniqueName: \"kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr\") pod \"placement-d6e4-account-create-9tvc8\" (UID: \"cb05c85b-3440-4c78-b64a-9e950da85ed9\") " pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.458980 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5zm5p-config-mg5zh"] Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.459846 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.460862 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpzr\" (UniqueName: \"kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr\") pod \"placement-d6e4-account-create-9tvc8\" (UID: \"cb05c85b-3440-4c78-b64a-9e950da85ed9\") " pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.484394 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5zm5p-config-mg5zh"] Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.511588 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.634493 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.634528 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcv45\" (UniqueName: \"kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.634554 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.634581 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.634848 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.641897 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c1f1-account-create-xrwwd"] Jun 13 05:04:47 crc kubenswrapper[4894]: W0613 05:04:47.649406 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aba867f_c501_4477_bbc6_9d713f2d2b13.slice/crio-eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa WatchSource:0}: Error finding container eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa: Status 404 returned error can't find the container with id eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736583 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736698 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: 
\"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736716 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcv45\" (UniqueName: \"kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736734 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736757 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.736991 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.737044 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.738393 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.738830 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.753231 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcv45\" (UniqueName: \"kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45\") pod \"ovn-controller-5zm5p-config-mg5zh\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.869409 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:47 crc kubenswrapper[4894]: I0613 05:04:47.946664 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6e4-account-create-9tvc8"] Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.193864 4894 generic.go:334] "Generic (PLEG): container finished" podID="5aba867f-c501-4477-bbc6-9d713f2d2b13" containerID="2d1d3dd87ffd559c91fe9fc1c601a987a3d5aa7dc866b76ad24953701b8af19c" exitCode=0 Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.193928 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c1f1-account-create-xrwwd" event={"ID":"5aba867f-c501-4477-bbc6-9d713f2d2b13","Type":"ContainerDied","Data":"2d1d3dd87ffd559c91fe9fc1c601a987a3d5aa7dc866b76ad24953701b8af19c"} Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.193959 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c1f1-account-create-xrwwd" event={"ID":"5aba867f-c501-4477-bbc6-9d713f2d2b13","Type":"ContainerStarted","Data":"eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa"} Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.197968 4894 generic.go:334] "Generic (PLEG): container finished" podID="cb05c85b-3440-4c78-b64a-9e950da85ed9" containerID="1d26a1188086b7a5561b6fcf3240baf9b9cb54329fc1d67c6aca6cbbc409fa38" exitCode=0 Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.198813 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6e4-account-create-9tvc8" event={"ID":"cb05c85b-3440-4c78-b64a-9e950da85ed9","Type":"ContainerDied","Data":"1d26a1188086b7a5561b6fcf3240baf9b9cb54329fc1d67c6aca6cbbc409fa38"} Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.198836 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6e4-account-create-9tvc8" event={"ID":"cb05c85b-3440-4c78-b64a-9e950da85ed9","Type":"ContainerStarted","Data":"833aad7e111f47ebb105d03c4158dd35ab09c2ef3bbaea88b4d1226287514fb0"} Jun 13 05:04:48 crc kubenswrapper[4894]: I0613 05:04:48.286797 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5zm5p-config-mg5zh"] Jun 13 05:04:48 crc kubenswrapper[4894]: W0613 05:04:48.290853 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdbd2ee6_dba1_47e5_b01e_fb168e861e41.slice/crio-11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0 WatchSource:0}: Error finding container 11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0: Status 404 returned error can't find the container with id 11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0 Jun 13 05:04:49 crc kubenswrapper[4894]: I0613 05:04:49.209685 4894 generic.go:334] "Generic (PLEG): container finished" podID="fdbd2ee6-dba1-47e5-b01e-fb168e861e41" containerID="c39332d60ca356dea2df67b7e64412f3b8d72505e36188b7f690e8ce86664439" exitCode=0 Jun 13 05:04:49 crc kubenswrapper[4894]: I0613 05:04:49.209756 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5zm5p-config-mg5zh" event={"ID":"fdbd2ee6-dba1-47e5-b01e-fb168e861e41","Type":"ContainerDied","Data":"c39332d60ca356dea2df67b7e64412f3b8d72505e36188b7f690e8ce86664439"} Jun 13 05:04:49 crc kubenswrapper[4894]: I0613 05:04:49.209782 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5zm5p-config-mg5zh" 
event={"ID":"fdbd2ee6-dba1-47e5-b01e-fb168e861e41","Type":"ContainerStarted","Data":"11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0"} Jun 13 05:04:52 crc kubenswrapper[4894]: I0613 05:04:52.180418 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5zm5p" Jun 13 05:04:53 crc kubenswrapper[4894]: I0613 05:04:53.839080 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.135473 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-skf6h"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.136933 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.152981 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-skf6h"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.244101 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4c6w7"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.245179 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.273379 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4c6w7"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.301276 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kldk\" (UniqueName: \"kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk\") pod \"cinder-db-create-skf6h\" (UID: \"48af4afd-782d-44df-a045-53a21dc75744\") " pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.341835 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-k5mrc"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.342772 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.350983 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k5mrc"] Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.402528 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kldk\" (UniqueName: \"kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk\") pod \"cinder-db-create-skf6h\" (UID: \"48af4afd-782d-44df-a045-53a21dc75744\") " pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.402602 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsnl\" (UniqueName: \"kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl\") pod \"barbican-db-create-4c6w7\" (UID: \"ab77546b-abcf-47cc-88ab-1d11d45d837d\") " pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.430319 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kldk\" (UniqueName: \"kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk\") pod \"cinder-db-create-skf6h\" (UID: \"48af4afd-782d-44df-a045-53a21dc75744\") " pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.456222 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.503925 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4fjq\" (UniqueName: \"kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq\") pod \"neutron-db-create-k5mrc\" (UID: \"375c390c-32f8-4a62-84df-ce789ec5a118\") " pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.503983 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsnl\" (UniqueName: \"kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl\") pod \"barbican-db-create-4c6w7\" (UID: \"ab77546b-abcf-47cc-88ab-1d11d45d837d\") " pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.536231 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsnl\" (UniqueName: \"kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl\") pod \"barbican-db-create-4c6w7\" (UID: \"ab77546b-abcf-47cc-88ab-1d11d45d837d\") " pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.558998 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.606182 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4fjq\" (UniqueName: \"kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq\") pod \"neutron-db-create-k5mrc\" (UID: \"375c390c-32f8-4a62-84df-ce789ec5a118\") " pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.630159 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4fjq\" (UniqueName: \"kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq\") pod \"neutron-db-create-k5mrc\" (UID: \"375c390c-32f8-4a62-84df-ce789ec5a118\") " pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:54 crc kubenswrapper[4894]: I0613 05:04:54.656599 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.723116 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.727146 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.733544 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842434 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn\") pod \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842478 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhn5d\" (UniqueName: \"kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d\") pod \"5aba867f-c501-4477-bbc6-9d713f2d2b13\" (UID: \"5aba867f-c501-4477-bbc6-9d713f2d2b13\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842524 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run\") pod \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842569 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcv45\" (UniqueName: \"kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45\") pod \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842596 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn\") pod \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842706 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfpzr\" 
(UniqueName: \"kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr\") pod \"cb05c85b-3440-4c78-b64a-9e950da85ed9\" (UID: \"cb05c85b-3440-4c78-b64a-9e950da85ed9\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842700 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fdbd2ee6-dba1-47e5-b01e-fb168e861e41" (UID: "fdbd2ee6-dba1-47e5-b01e-fb168e861e41"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.842732 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts\") pod \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\" (UID: \"fdbd2ee6-dba1-47e5-b01e-fb168e861e41\") " Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.843424 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fdbd2ee6-dba1-47e5-b01e-fb168e861e41" (UID: "fdbd2ee6-dba1-47e5-b01e-fb168e861e41"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.843449 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run" (OuterVolumeSpecName: "var-run") pod "fdbd2ee6-dba1-47e5-b01e-fb168e861e41" (UID: "fdbd2ee6-dba1-47e5-b01e-fb168e861e41"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.843505 4894 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.844380 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts" (OuterVolumeSpecName: "scripts") pod "fdbd2ee6-dba1-47e5-b01e-fb168e861e41" (UID: "fdbd2ee6-dba1-47e5-b01e-fb168e861e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.845830 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d" (OuterVolumeSpecName: "kube-api-access-lhn5d") pod "5aba867f-c501-4477-bbc6-9d713f2d2b13" (UID: "5aba867f-c501-4477-bbc6-9d713f2d2b13"). InnerVolumeSpecName "kube-api-access-lhn5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.847984 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr" (OuterVolumeSpecName: "kube-api-access-zfpzr") pod "cb05c85b-3440-4c78-b64a-9e950da85ed9" (UID: "cb05c85b-3440-4c78-b64a-9e950da85ed9"). InnerVolumeSpecName "kube-api-access-zfpzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.849419 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45" (OuterVolumeSpecName: "kube-api-access-fcv45") pod "fdbd2ee6-dba1-47e5-b01e-fb168e861e41" (UID: "fdbd2ee6-dba1-47e5-b01e-fb168e861e41"). InnerVolumeSpecName "kube-api-access-fcv45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945126 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcv45\" (UniqueName: \"kubernetes.io/projected/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-kube-api-access-fcv45\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945159 4894 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945171 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfpzr\" (UniqueName: \"kubernetes.io/projected/cb05c85b-3440-4c78-b64a-9e950da85ed9-kube-api-access-zfpzr\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945179 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945189 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhn5d\" (UniqueName: \"kubernetes.io/projected/5aba867f-c501-4477-bbc6-9d713f2d2b13-kube-api-access-lhn5d\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:56 crc kubenswrapper[4894]: I0613 05:04:56.945197 4894 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdbd2ee6-dba1-47e5-b01e-fb168e861e41-var-run\") on node \"crc\" DevicePath \"\"" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.108848 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k5mrc"] Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.156506 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-skf6h"] Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.161573 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4c6w7"] Jun 13 05:04:57 crc kubenswrapper[4894]: W0613 05:04:57.170364 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48af4afd_782d_44df_a045_53a21dc75744.slice/crio-05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759 WatchSource:0}: Error finding container 05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759: Status 404 returned error can't find the container with id 05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759 Jun 13 05:04:57 crc kubenswrapper[4894]: W0613 05:04:57.172841 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab77546b_abcf_47cc_88ab_1d11d45d837d.slice/crio-8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417 WatchSource:0}: Error finding container 
8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417: Status 404 returned error can't find the container with id 8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417 Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.285100 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k5mrc" event={"ID":"375c390c-32f8-4a62-84df-ce789ec5a118","Type":"ContainerStarted","Data":"c68fb55eaf89d0855840a18af7462087ad7cf6b530b9b2977c35d5d38cdac064"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.286821 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c6w7" event={"ID":"ab77546b-abcf-47cc-88ab-1d11d45d837d","Type":"ContainerStarted","Data":"8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.288160 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c1f1-account-create-xrwwd" event={"ID":"5aba867f-c501-4477-bbc6-9d713f2d2b13","Type":"ContainerDied","Data":"eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.288182 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb3c6bcf7e842aa54fce87b3b297d444a64f5ceca709f8a5d2a01e50aac7ecfa" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.288227 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c1f1-account-create-xrwwd" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.290890 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5zm5p-config-mg5zh" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.290882 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5zm5p-config-mg5zh" event={"ID":"fdbd2ee6-dba1-47e5-b01e-fb168e861e41","Type":"ContainerDied","Data":"11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.290999 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a30617f435a8bc2a0ddea6f63d8827c4709e4b3275e2564f7de134393d49a0" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.291689 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-skf6h" event={"ID":"48af4afd-782d-44df-a045-53a21dc75744","Type":"ContainerStarted","Data":"05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.295669 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6e4-account-create-9tvc8" event={"ID":"cb05c85b-3440-4c78-b64a-9e950da85ed9","Type":"ContainerDied","Data":"833aad7e111f47ebb105d03c4158dd35ab09c2ef3bbaea88b4d1226287514fb0"} Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.295692 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="833aad7e111f47ebb105d03c4158dd35ab09c2ef3bbaea88b4d1226287514fb0" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.295722 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d6e4-account-create-9tvc8" Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.900849 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5zm5p-config-mg5zh"] Jun 13 05:04:57 crc kubenswrapper[4894]: I0613 05:04:57.908455 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5zm5p-config-mg5zh"] Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.312176 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbd2ee6-dba1-47e5-b01e-fb168e861e41" path="/var/lib/kubelet/pods/fdbd2ee6-dba1-47e5-b01e-fb168e861e41/volumes" Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.317925 4894 generic.go:334] "Generic (PLEG): container finished" podID="ab77546b-abcf-47cc-88ab-1d11d45d837d" containerID="1095cfa46f020d5a1967a1785095cfe5ef6f01d81633d339e2af4faf0ac03a91" exitCode=0 Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.318033 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c6w7" event={"ID":"ab77546b-abcf-47cc-88ab-1d11d45d837d","Type":"ContainerDied","Data":"1095cfa46f020d5a1967a1785095cfe5ef6f01d81633d339e2af4faf0ac03a91"} Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.319949 4894 generic.go:334] "Generic (PLEG): container finished" podID="48af4afd-782d-44df-a045-53a21dc75744" containerID="06d16fa5d1dd95ed64897321891cc340ad55efc6b8917d268cb45a36b8814bb5" exitCode=0 Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.320175 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-skf6h" event={"ID":"48af4afd-782d-44df-a045-53a21dc75744","Type":"ContainerDied","Data":"06d16fa5d1dd95ed64897321891cc340ad55efc6b8917d268cb45a36b8814bb5"} Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.321225 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6wgt" event={"ID":"9908123d-bc70-4017-953a-8f0a082f2726","Type":"ContainerStarted","Data":"f437271ec3faf6391096e86126d0e358c6b2fbd479ff68f74e82797dcec85c12"} Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.322382 4894 generic.go:334] "Generic (PLEG): container finished" podID="375c390c-32f8-4a62-84df-ce789ec5a118" containerID="7d7da288532c226ecceb30f339e777c0228c2294ea36b0db72aabe58eb4a1d49" exitCode=0 Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.322413 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k5mrc" event={"ID":"375c390c-32f8-4a62-84df-ce789ec5a118","Type":"ContainerDied","Data":"7d7da288532c226ecceb30f339e777c0228c2294ea36b0db72aabe58eb4a1d49"} Jun 13 05:04:58 crc kubenswrapper[4894]: I0613 05:04:58.376532 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-t6wgt" podStartSLOduration=3.30991026 podStartE2EDuration="16.376513777s" podCreationTimestamp="2025-06-13 05:04:42 +0000 UTC" firstStartedPulling="2025-06-13 05:04:43.617585634 +0000 UTC m=+842.063833097" lastFinishedPulling="2025-06-13 05:04:56.684189141 +0000 UTC m=+855.130436614" observedRunningTime="2025-06-13 05:04:58.369998614 +0000 UTC m=+856.816246087" watchObservedRunningTime="2025-06-13 05:04:58.376513777 +0000 UTC m=+856.822761240" Jun 13 05:04:59 crc kubenswrapper[4894]: I0613 05:04:59.753325 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4c6w7" Jun 13 05:04:59 crc kubenswrapper[4894]: I0613 05:04:59.839439 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k5mrc" Jun 13 05:04:59 crc kubenswrapper[4894]: I0613 05:04:59.855052 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-skf6h" Jun 13 05:04:59 crc kubenswrapper[4894]: I0613 05:04:59.898523 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsnl\" (UniqueName: \"kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl\") pod \"ab77546b-abcf-47cc-88ab-1d11d45d837d\" (UID: \"ab77546b-abcf-47cc-88ab-1d11d45d837d\") " Jun 13 05:04:59 crc kubenswrapper[4894]: I0613 05:04:59.903788 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl" (OuterVolumeSpecName: "kube-api-access-fqsnl") pod "ab77546b-abcf-47cc-88ab-1d11d45d837d" (UID: "ab77546b-abcf-47cc-88ab-1d11d45d837d"). InnerVolumeSpecName "kube-api-access-fqsnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.000301 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kldk\" (UniqueName: \"kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk\") pod \"48af4afd-782d-44df-a045-53a21dc75744\" (UID: \"48af4afd-782d-44df-a045-53a21dc75744\") " Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.000373 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4fjq\" (UniqueName: \"kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq\") pod \"375c390c-32f8-4a62-84df-ce789ec5a118\" (UID: \"375c390c-32f8-4a62-84df-ce789ec5a118\") " Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.001197 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsnl\" (UniqueName: \"kubernetes.io/projected/ab77546b-abcf-47cc-88ab-1d11d45d837d-kube-api-access-fqsnl\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.005690 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq" (OuterVolumeSpecName: "kube-api-access-z4fjq") pod "375c390c-32f8-4a62-84df-ce789ec5a118" (UID: "375c390c-32f8-4a62-84df-ce789ec5a118"). InnerVolumeSpecName "kube-api-access-z4fjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.005768 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk" (OuterVolumeSpecName: "kube-api-access-8kldk") pod "48af4afd-782d-44df-a045-53a21dc75744" (UID: "48af4afd-782d-44df-a045-53a21dc75744"). InnerVolumeSpecName "kube-api-access-8kldk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.102849 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kldk\" (UniqueName: \"kubernetes.io/projected/48af4afd-782d-44df-a045-53a21dc75744-kube-api-access-8kldk\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.102909 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4fjq\" (UniqueName: \"kubernetes.io/projected/375c390c-32f8-4a62-84df-ce789ec5a118-kube-api-access-z4fjq\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.346072 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k5mrc" event={"ID":"375c390c-32f8-4a62-84df-ce789ec5a118","Type":"ContainerDied","Data":"c68fb55eaf89d0855840a18af7462087ad7cf6b530b9b2977c35d5d38cdac064"} Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.346497 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68fb55eaf89d0855840a18af7462087ad7cf6b530b9b2977c35d5d38cdac064" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.346576 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k5mrc" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.352392 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c6w7" event={"ID":"ab77546b-abcf-47cc-88ab-1d11d45d837d","Type":"ContainerDied","Data":"8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417"} Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.352428 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4c6w7" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.352451 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da188e9370a807656f041e2624b961772abe66031361ae27514929ee3b30417" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.356184 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-skf6h" event={"ID":"48af4afd-782d-44df-a045-53a21dc75744","Type":"ContainerDied","Data":"05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759"} Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.356237 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a7af759ba732ce4cefc5445807a83b0e465cb187c44c074978302eac295759" Jun 13 05:05:00 crc kubenswrapper[4894]: I0613 05:05:00.356263 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-skf6h" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.037935 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-vwx5j"] Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.038957 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdbd2ee6-dba1-47e5-b01e-fb168e861e41" containerName="ovn-config" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039008 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdbd2ee6-dba1-47e5-b01e-fb168e861e41" containerName="ovn-config" Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.039043 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba867f-c501-4477-bbc6-9d713f2d2b13" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039056 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba867f-c501-4477-bbc6-9d713f2d2b13" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.039079 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48af4afd-782d-44df-a045-53a21dc75744" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039092 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="48af4afd-782d-44df-a045-53a21dc75744" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.039128 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375c390c-32f8-4a62-84df-ce789ec5a118" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039140 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="375c390c-32f8-4a62-84df-ce789ec5a118" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.039163 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb05c85b-3440-4c78-b64a-9e950da85ed9" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039177 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb05c85b-3440-4c78-b64a-9e950da85ed9" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: E0613 05:05:02.039200 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab77546b-abcf-47cc-88ab-1d11d45d837d" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039212 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab77546b-abcf-47cc-88ab-1d11d45d837d" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039474 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aba867f-c501-4477-bbc6-9d713f2d2b13" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039502 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="375c390c-32f8-4a62-84df-ce789ec5a118" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039522 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab77546b-abcf-47cc-88ab-1d11d45d837d" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039550 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdbd2ee6-dba1-47e5-b01e-fb168e861e41" containerName="ovn-config" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039581 
4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb05c85b-3440-4c78-b64a-9e950da85ed9" containerName="mariadb-account-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.039597 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="48af4afd-782d-44df-a045-53a21dc75744" containerName="mariadb-database-create" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.040467 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.044358 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.141280 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.141391 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n54k\" (UniqueName: \"kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.243503 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.243614 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n54k\" (UniqueName: \"kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.243788 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.263041 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n54k\" (UniqueName: \"kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k\") pod \"crc-debug-vwx5j\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " pod="openstack/crc-debug-vwx5j" Jun 13 05:05:02 crc kubenswrapper[4894]: I0613 05:05:02.373749 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vwx5j" Jun 13 05:05:03 crc kubenswrapper[4894]: I0613 05:05:03.383824 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vwx5j" event={"ID":"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99","Type":"ContainerStarted","Data":"b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a"} Jun 13 05:05:03 crc kubenswrapper[4894]: I0613 05:05:03.384136 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vwx5j" event={"ID":"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99","Type":"ContainerStarted","Data":"0b6fe2e4f45269e785ab269640b138ff5c2217ed8dfa36f9b7391ab4dcfd0534"} Jun 13 05:05:03 crc kubenswrapper[4894]: I0613 05:05:03.386499 4894 generic.go:334] "Generic (PLEG): container finished" podID="9908123d-bc70-4017-953a-8f0a082f2726" containerID="f437271ec3faf6391096e86126d0e358c6b2fbd479ff68f74e82797dcec85c12" exitCode=0 Jun 13 05:05:03 crc kubenswrapper[4894]: I0613 05:05:03.386540 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6wgt" event={"ID":"9908123d-bc70-4017-953a-8f0a082f2726","Type":"ContainerDied","Data":"f437271ec3faf6391096e86126d0e358c6b2fbd479ff68f74e82797dcec85c12"} Jun 13 05:05:03 crc kubenswrapper[4894]: I0613 05:05:03.406905 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-vwx5j" podStartSLOduration=1.406887547 podStartE2EDuration="1.406887547s" podCreationTimestamp="2025-06-13 05:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:03.398514441 +0000 UTC m=+861.844761904" watchObservedRunningTime="2025-06-13 05:05:03.406887547 +0000 UTC m=+861.853135020" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.393991 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-843e-account-create-cr568"] Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.396380 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.402106 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.414155 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-843e-account-create-cr568"] Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.482366 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtwl\" (UniqueName: \"kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl\") pod \"cinder-843e-account-create-cr568\" (UID: \"519e8c61-c11c-42c9-bb32-c4a454724fe1\") " pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.583974 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqtwl\" (UniqueName: \"kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl\") pod \"cinder-843e-account-create-cr568\" (UID: \"519e8c61-c11c-42c9-bb32-c4a454724fe1\") " pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.607112 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtwl\" (UniqueName: \"kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl\") pod \"cinder-843e-account-create-cr568\" (UID: \"519e8c61-c11c-42c9-bb32-c4a454724fe1\") " pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.737769 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.862432 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6wgt" Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.988946 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data\") pod \"9908123d-bc70-4017-953a-8f0a082f2726\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.989017 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data\") pod \"9908123d-bc70-4017-953a-8f0a082f2726\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.989066 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmkqj\" (UniqueName: \"kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj\") pod \"9908123d-bc70-4017-953a-8f0a082f2726\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.989184 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle\") pod \"9908123d-bc70-4017-953a-8f0a082f2726\" (UID: \"9908123d-bc70-4017-953a-8f0a082f2726\") " Jun 13 05:05:04 crc kubenswrapper[4894]: I0613 05:05:04.994868 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9908123d-bc70-4017-953a-8f0a082f2726" (UID: "9908123d-bc70-4017-953a-8f0a082f2726"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.008047 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj" (OuterVolumeSpecName: "kube-api-access-vmkqj") pod "9908123d-bc70-4017-953a-8f0a082f2726" (UID: "9908123d-bc70-4017-953a-8f0a082f2726"). InnerVolumeSpecName "kube-api-access-vmkqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.024512 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9908123d-bc70-4017-953a-8f0a082f2726" (UID: "9908123d-bc70-4017-953a-8f0a082f2726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.074130 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data" (OuterVolumeSpecName: "config-data") pod "9908123d-bc70-4017-953a-8f0a082f2726" (UID: "9908123d-bc70-4017-953a-8f0a082f2726"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.090797 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.090834 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.090846 4894 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9908123d-bc70-4017-953a-8f0a082f2726-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.090861 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmkqj\" (UniqueName: \"kubernetes.io/projected/9908123d-bc70-4017-953a-8f0a082f2726-kube-api-access-vmkqj\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.165829 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-843e-account-create-cr568"] Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.415364 4894 generic.go:334] "Generic (PLEG): container finished" podID="519e8c61-c11c-42c9-bb32-c4a454724fe1" containerID="ede9096899e17d4a008948fceb45c19061180e59d18826993e4f8b1771c4f7ec" exitCode=0 Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.415936 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-843e-account-create-cr568" event={"ID":"519e8c61-c11c-42c9-bb32-c4a454724fe1","Type":"ContainerDied","Data":"ede9096899e17d4a008948fceb45c19061180e59d18826993e4f8b1771c4f7ec"} Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.416004 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-843e-account-create-cr568" event={"ID":"519e8c61-c11c-42c9-bb32-c4a454724fe1","Type":"ContainerStarted","Data":"b0cca17dea1d5f29bf1d701bb329048916c37da5845ad39a97aea570deb82509"} Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.417761 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t6wgt" event={"ID":"9908123d-bc70-4017-953a-8f0a082f2726","Type":"ContainerDied","Data":"c82e072a6fdd6fd93e265cdc172674367c3cd6775720f01a9632da0cee93605c"} Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.417821 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t6wgt" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.417823 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c82e072a6fdd6fd93e265cdc172674367c3cd6775720f01a9632da0cee93605c" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.850035 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:05 crc kubenswrapper[4894]: E0613 05:05:05.850526 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9908123d-bc70-4017-953a-8f0a082f2726" containerName="glance-db-sync" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.850541 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9908123d-bc70-4017-953a-8f0a082f2726" containerName="glance-db-sync" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.850690 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9908123d-bc70-4017-953a-8f0a082f2726" containerName="glance-db-sync" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.851418 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.901221 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.901281 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.901316 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghg7\" (UniqueName: \"kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.901331 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.901514 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:05 crc kubenswrapper[4894]: I0613 05:05:05.913984 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.002385 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.002462 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghg7\" (UniqueName: \"kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.002482 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.002500 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.002571 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.003416 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.003541 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.003684 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.004233 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config\") pod \"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.024871 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghg7\" (UniqueName: \"kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7\") pod 
\"dnsmasq-dns-5886589657-tltrd\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.168202 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.647820 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:06 crc kubenswrapper[4894]: W0613 05:05:06.660858 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5add461_1f3d_4ae7_b0a7_967d111be2ba.slice/crio-e5054d0ab955727731cf785ec47a95cc181c217222c6989cd39f2d62123c287c WatchSource:0}: Error finding container e5054d0ab955727731cf785ec47a95cc181c217222c6989cd39f2d62123c287c: Status 404 returned error can't find the container with id e5054d0ab955727731cf785ec47a95cc181c217222c6989cd39f2d62123c287c Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.782934 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.944753 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqtwl\" (UniqueName: \"kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl\") pod \"519e8c61-c11c-42c9-bb32-c4a454724fe1\" (UID: \"519e8c61-c11c-42c9-bb32-c4a454724fe1\") " Jun 13 05:05:06 crc kubenswrapper[4894]: I0613 05:05:06.948697 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl" (OuterVolumeSpecName: "kube-api-access-dqtwl") pod "519e8c61-c11c-42c9-bb32-c4a454724fe1" (UID: "519e8c61-c11c-42c9-bb32-c4a454724fe1"). InnerVolumeSpecName "kube-api-access-dqtwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.046287 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqtwl\" (UniqueName: \"kubernetes.io/projected/519e8c61-c11c-42c9-bb32-c4a454724fe1-kube-api-access-dqtwl\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.162585 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wzjkd"] Jun 13 05:05:07 crc kubenswrapper[4894]: E0613 05:05:07.163019 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519e8c61-c11c-42c9-bb32-c4a454724fe1" containerName="mariadb-account-create" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.163042 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="519e8c61-c11c-42c9-bb32-c4a454724fe1" containerName="mariadb-account-create" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.163247 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="519e8c61-c11c-42c9-bb32-c4a454724fe1" containerName="mariadb-account-create" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.163888 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.167490 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.167892 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.168189 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.174198 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2nhqr" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.182567 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wzjkd"] Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.350498 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.350560 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49sh\" (UniqueName: \"kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.350629 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.433825 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-843e-account-create-cr568" event={"ID":"519e8c61-c11c-42c9-bb32-c4a454724fe1","Type":"ContainerDied","Data":"b0cca17dea1d5f29bf1d701bb329048916c37da5845ad39a97aea570deb82509"} Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.433866 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0cca17dea1d5f29bf1d701bb329048916c37da5845ad39a97aea570deb82509" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.433917 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-843e-account-create-cr568" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.439634 4894 generic.go:334] "Generic (PLEG): container finished" podID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerID="88223d0b3c71ff476b6c4fbd13584d95e406fe76220d89cbe60c9eb8122c65b1" exitCode=0 Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.439696 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5886589657-tltrd" event={"ID":"d5add461-1f3d-4ae7-b0a7-967d111be2ba","Type":"ContainerDied","Data":"88223d0b3c71ff476b6c4fbd13584d95e406fe76220d89cbe60c9eb8122c65b1"} Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.439727 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5886589657-tltrd" event={"ID":"d5add461-1f3d-4ae7-b0a7-967d111be2ba","Type":"ContainerStarted","Data":"e5054d0ab955727731cf785ec47a95cc181c217222c6989cd39f2d62123c287c"} Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.452816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.453041 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49sh\" (UniqueName: \"kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.453227 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.457460 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.460353 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.479617 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49sh\" (UniqueName: \"kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh\") pod \"keystone-db-sync-wzjkd\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:07 crc kubenswrapper[4894]: I0613 05:05:07.780515 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:08 crc kubenswrapper[4894]: I0613 05:05:08.430720 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wzjkd"] Jun 13 05:05:08 crc kubenswrapper[4894]: W0613 05:05:08.432694 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb658e7d_f920_4362_9ae8_149aaca08cda.slice/crio-b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411 WatchSource:0}: Error finding container b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411: Status 404 returned error can't find the container with id b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411 Jun 13 05:05:08 crc kubenswrapper[4894]: I0613 05:05:08.450427 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5886589657-tltrd" event={"ID":"d5add461-1f3d-4ae7-b0a7-967d111be2ba","Type":"ContainerStarted","Data":"33b89d2ba91baca67461085b4659ff7270dfdff8b65de22b436d8a5ac127cafb"} Jun 13 05:05:08 crc kubenswrapper[4894]: I0613 05:05:08.451618 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:08 crc kubenswrapper[4894]: I0613 05:05:08.455777 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wzjkd" event={"ID":"cb658e7d-f920-4362-9ae8-149aaca08cda","Type":"ContainerStarted","Data":"b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411"} Jun 13 05:05:08 crc kubenswrapper[4894]: I0613 05:05:08.477868 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5886589657-tltrd" podStartSLOduration=3.47785025 podStartE2EDuration="3.47785025s" podCreationTimestamp="2025-06-13 05:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:08.469988489 +0000 UTC m=+866.916235962" watchObservedRunningTime="2025-06-13 05:05:08.47785025 +0000 UTC m=+866.924097733" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.054794 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-vwx5j"] Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.055987 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-vwx5j" podUID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" containerName="container-00" containerID="cri-o://b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a" gracePeriod=2 Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.065705 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-vwx5j"] Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.186412 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vwx5j" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.344250 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host\") pod \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.344408 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host" (OuterVolumeSpecName: "host") pod "ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" (UID: "ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.344585 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n54k\" (UniqueName: \"kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k\") pod \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\" (UID: \"ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99\") " Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.345214 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.351246 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k" (OuterVolumeSpecName: "kube-api-access-2n54k") pod "ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" (UID: "ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99"). InnerVolumeSpecName "kube-api-access-2n54k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.446674 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n54k\" (UniqueName: \"kubernetes.io/projected/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99-kube-api-access-2n54k\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.497965 4894 generic.go:334] "Generic (PLEG): container finished" podID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" containerID="b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a" exitCode=0 Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.498038 4894 scope.go:117] "RemoveContainer" containerID="b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.498064 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vwx5j" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.501388 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wzjkd" event={"ID":"cb658e7d-f920-4362-9ae8-149aaca08cda","Type":"ContainerStarted","Data":"5f13fe56c8b83a6fb60747b464bc9f7a699a37277a519bb9eaaa16a1c5540bc5"} Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.534935 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wzjkd" podStartSLOduration=2.020885761 podStartE2EDuration="6.53491265s" podCreationTimestamp="2025-06-13 05:05:07 +0000 UTC" firstStartedPulling="2025-06-13 05:05:08.435367873 +0000 UTC m=+866.881615346" lastFinishedPulling="2025-06-13 05:05:12.949394772 +0000 UTC m=+871.395642235" observedRunningTime="2025-06-13 05:05:13.526875174 +0000 UTC m=+871.973122677" watchObservedRunningTime="2025-06-13 05:05:13.53491265 +0000 UTC m=+871.981160153" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.537523 4894 scope.go:117] "RemoveContainer" containerID="b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a" Jun 13 05:05:13 crc kubenswrapper[4894]: E0613 05:05:13.538059 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a\": container with ID starting with b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a not found: ID does not exist" containerID="b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a" Jun 13 05:05:13 crc kubenswrapper[4894]: I0613 05:05:13.538125 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a"} err="failed to get container status \"b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a\": rpc error: code = NotFound desc = could not find container \"b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a\": container with ID starting with b90056283ab0da1f6ed8eacdddf69e9abfb1793094af7917814e4411ef01999a not found: ID does not exist" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.210926 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8adb-account-create-sxgkf"] Jun 13 05:05:14 crc kubenswrapper[4894]: E0613 05:05:14.211282 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" containerName="container-00" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.211297 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" containerName="container-00" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.211501 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" containerName="container-00" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.212131 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.217818 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.222030 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8adb-account-create-sxgkf"] Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.262305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzq6\" (UniqueName: \"kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6\") pod \"barbican-8adb-account-create-sxgkf\" (UID: \"3ff06e15-dbe8-4864-a039-e30cb2cd88d5\") " pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.288985 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99" path="/var/lib/kubelet/pods/ffe9a196-dc0c-4e69-b047-ea5d0b9c1a99/volumes" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.363781 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzq6\" (UniqueName: \"kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6\") pod \"barbican-8adb-account-create-sxgkf\" (UID: \"3ff06e15-dbe8-4864-a039-e30cb2cd88d5\") " pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.390465 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzq6\" (UniqueName: \"kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6\") pod \"barbican-8adb-account-create-sxgkf\" (UID: \"3ff06e15-dbe8-4864-a039-e30cb2cd88d5\") " pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.399733 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e141-account-create-sgw88"] Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.400864 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.404547 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.422377 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e141-account-create-sgw88"] Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.467112 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lv9\" (UniqueName: \"kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9\") pod \"neutron-e141-account-create-sgw88\" (UID: \"82d173fc-d458-4f68-b2ea-b6b2ed942c5d\") " pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.542462 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.569330 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lv9\" (UniqueName: \"kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9\") pod \"neutron-e141-account-create-sgw88\" (UID: \"82d173fc-d458-4f68-b2ea-b6b2ed942c5d\") " pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.591809 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lv9\" (UniqueName: \"kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9\") pod \"neutron-e141-account-create-sgw88\" (UID: \"82d173fc-d458-4f68-b2ea-b6b2ed942c5d\") " pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:14 crc kubenswrapper[4894]: I0613 05:05:14.746844 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.046056 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8adb-account-create-sxgkf"] Jun 13 05:05:15 crc kubenswrapper[4894]: W0613 05:05:15.049240 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff06e15_dbe8_4864_a039_e30cb2cd88d5.slice/crio-61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d WatchSource:0}: Error finding container 61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d: Status 404 returned error can't find the container with id 61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.193125 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e141-account-create-sgw88"] Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.525012 4894 generic.go:334] "Generic (PLEG): container finished" podID="82d173fc-d458-4f68-b2ea-b6b2ed942c5d" containerID="4d574624fe567cc280930834dc945e3168634c2b396e195f1264c69d10da2cbd" exitCode=0 Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.525343 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e141-account-create-sgw88" event={"ID":"82d173fc-d458-4f68-b2ea-b6b2ed942c5d","Type":"ContainerDied","Data":"4d574624fe567cc280930834dc945e3168634c2b396e195f1264c69d10da2cbd"} Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.525525 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e141-account-create-sgw88" event={"ID":"82d173fc-d458-4f68-b2ea-b6b2ed942c5d","Type":"ContainerStarted","Data":"4db4fa5534d1abb569e50d565497c9d5e6ff5d363a862538dc945f9e73c94a4c"} Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.529047 4894 generic.go:334] "Generic (PLEG): container finished" podID="3ff06e15-dbe8-4864-a039-e30cb2cd88d5" containerID="7f669537e496e33923176d1259843065243d7a8916ad38c5c3362ab6dc51afbc" exitCode=0 Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.529101 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8adb-account-create-sxgkf" event={"ID":"3ff06e15-dbe8-4864-a039-e30cb2cd88d5","Type":"ContainerDied","Data":"7f669537e496e33923176d1259843065243d7a8916ad38c5c3362ab6dc51afbc"} Jun 13 05:05:15 crc kubenswrapper[4894]: I0613 05:05:15.529134 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-8adb-account-create-sxgkf" event={"ID":"3ff06e15-dbe8-4864-a039-e30cb2cd88d5","Type":"ContainerStarted","Data":"61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d"} Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.171761 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.236812 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.237157 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="dnsmasq-dns" containerID="cri-o://bcfe7bf751fbc4cc225a82fc48252796d7312c544975658459b9964aecf1b2a2" gracePeriod=10 Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.350674 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.541091 4894 generic.go:334] "Generic (PLEG): container finished" podID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerID="bcfe7bf751fbc4cc225a82fc48252796d7312c544975658459b9964aecf1b2a2" exitCode=0 Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.541185 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" event={"ID":"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4","Type":"ContainerDied","Data":"bcfe7bf751fbc4cc225a82fc48252796d7312c544975658459b9964aecf1b2a2"} Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.543849 4894 generic.go:334] "Generic (PLEG): container finished" podID="cb658e7d-f920-4362-9ae8-149aaca08cda" containerID="5f13fe56c8b83a6fb60747b464bc9f7a699a37277a519bb9eaaa16a1c5540bc5" exitCode=0 Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.544040 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wzjkd" event={"ID":"cb658e7d-f920-4362-9ae8-149aaca08cda","Type":"ContainerDied","Data":"5f13fe56c8b83a6fb60747b464bc9f7a699a37277a519bb9eaaa16a1c5540bc5"} Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.681600 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.810144 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb\") pod \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.810191 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb\") pod \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.810240 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc\") pod \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.810292 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config\") pod \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.810317 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx7t5\" (UniqueName: \"kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5\") pod \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\" (UID: \"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4\") " Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.823913 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5" (OuterVolumeSpecName: "kube-api-access-wx7t5") pod "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" (UID: "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4"). InnerVolumeSpecName "kube-api-access-wx7t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.852904 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config" (OuterVolumeSpecName: "config") pod "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" (UID: "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.862748 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" (UID: "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.865646 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" (UID: "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.870400 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" (UID: "0ab02aed-8c68-42ce-8b7b-31ddc678cdc4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.878574 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.883081 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.912575 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.912822 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.912882 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.912952 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:16 crc kubenswrapper[4894]: I0613 05:05:16.913017 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx7t5\" (UniqueName: \"kubernetes.io/projected/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4-kube-api-access-wx7t5\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.013831 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzq6\" (UniqueName: \"kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6\") pod \"3ff06e15-dbe8-4864-a039-e30cb2cd88d5\" (UID: \"3ff06e15-dbe8-4864-a039-e30cb2cd88d5\") " Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.013903 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5lv9\" (UniqueName: \"kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9\") pod \"82d173fc-d458-4f68-b2ea-b6b2ed942c5d\" (UID: \"82d173fc-d458-4f68-b2ea-b6b2ed942c5d\") " Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.016736 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6" (OuterVolumeSpecName: "kube-api-access-4xzq6") pod "3ff06e15-dbe8-4864-a039-e30cb2cd88d5" (UID: "3ff06e15-dbe8-4864-a039-e30cb2cd88d5"). InnerVolumeSpecName "kube-api-access-4xzq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.017186 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9" (OuterVolumeSpecName: "kube-api-access-s5lv9") pod "82d173fc-d458-4f68-b2ea-b6b2ed942c5d" (UID: "82d173fc-d458-4f68-b2ea-b6b2ed942c5d"). InnerVolumeSpecName "kube-api-access-s5lv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.115242 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzq6\" (UniqueName: \"kubernetes.io/projected/3ff06e15-dbe8-4864-a039-e30cb2cd88d5-kube-api-access-4xzq6\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.115270 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5lv9\" (UniqueName: \"kubernetes.io/projected/82d173fc-d458-4f68-b2ea-b6b2ed942c5d-kube-api-access-s5lv9\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.555938 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e141-account-create-sgw88" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.555947 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e141-account-create-sgw88" event={"ID":"82d173fc-d458-4f68-b2ea-b6b2ed942c5d","Type":"ContainerDied","Data":"4db4fa5534d1abb569e50d565497c9d5e6ff5d363a862538dc945f9e73c94a4c"} Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.556096 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db4fa5534d1abb569e50d565497c9d5e6ff5d363a862538dc945f9e73c94a4c" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.558884 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8adb-account-create-sxgkf" event={"ID":"3ff06e15-dbe8-4864-a039-e30cb2cd88d5","Type":"ContainerDied","Data":"61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d"} Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.558978 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61592914d63f95b6e5599a5d26776d0b54f1d3e62c6a96f828b0992664d3b11d" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.559040 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8adb-account-create-sxgkf" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.568265 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.571023 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b7d5b7f-qbgzj" event={"ID":"0ab02aed-8c68-42ce-8b7b-31ddc678cdc4","Type":"ContainerDied","Data":"c3394be169d66dd9b37e8d8bc11c5f5e68a155048ce73dd1fbee5f861d00aab8"} Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.571122 4894 scope.go:117] "RemoveContainer" containerID="bcfe7bf751fbc4cc225a82fc48252796d7312c544975658459b9964aecf1b2a2" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.614248 4894 scope.go:117] "RemoveContainer" containerID="18d2a89e3e926ba407104219d31d641cfde7691e6293b2657d4398ec1ea41d09" Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.689370 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:05:17 crc kubenswrapper[4894]: I0613 05:05:17.701089 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644b7d5b7f-qbgzj"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:17.929141 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:17.941417 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle\") pod \"cb658e7d-f920-4362-9ae8-149aaca08cda\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.025027 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb658e7d-f920-4362-9ae8-149aaca08cda" (UID: "cb658e7d-f920-4362-9ae8-149aaca08cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.043254 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p49sh\" (UniqueName: \"kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh\") pod \"cb658e7d-f920-4362-9ae8-149aaca08cda\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.043379 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data\") pod \"cb658e7d-f920-4362-9ae8-149aaca08cda\" (UID: \"cb658e7d-f920-4362-9ae8-149aaca08cda\") " Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.043762 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.046317 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh" (OuterVolumeSpecName: "kube-api-access-p49sh") pod "cb658e7d-f920-4362-9ae8-149aaca08cda" (UID: "cb658e7d-f920-4362-9ae8-149aaca08cda"). InnerVolumeSpecName "kube-api-access-p49sh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.087205 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data" (OuterVolumeSpecName: "config-data") pod "cb658e7d-f920-4362-9ae8-149aaca08cda" (UID: "cb658e7d-f920-4362-9ae8-149aaca08cda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.145564 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb658e7d-f920-4362-9ae8-149aaca08cda-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.145598 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p49sh\" (UniqueName: \"kubernetes.io/projected/cb658e7d-f920-4362-9ae8-149aaca08cda-kube-api-access-p49sh\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.289163 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" path="/var/lib/kubelet/pods/0ab02aed-8c68-42ce-8b7b-31ddc678cdc4/volumes" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.579683 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wzjkd" event={"ID":"cb658e7d-f920-4362-9ae8-149aaca08cda","Type":"ContainerDied","Data":"b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411"} Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.580361 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b423e3808dca48a3af25a421698096f60428eea38eea2b2b8e7d13ac03d78411" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.579677 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wzjkd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.838798 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:19 crc kubenswrapper[4894]: E0613 05:05:18.839323 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb658e7d-f920-4362-9ae8-149aaca08cda" containerName="keystone-db-sync" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839335 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb658e7d-f920-4362-9ae8-149aaca08cda" containerName="keystone-db-sync" Jun 13 05:05:19 crc kubenswrapper[4894]: E0613 05:05:18.839352 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d173fc-d458-4f68-b2ea-b6b2ed942c5d" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839358 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d173fc-d458-4f68-b2ea-b6b2ed942c5d" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: E0613 05:05:18.839367 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="init" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839373 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="init" Jun 13 05:05:19 crc kubenswrapper[4894]: E0613 05:05:18.839383 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff06e15-dbe8-4864-a039-e30cb2cd88d5" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839390 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff06e15-dbe8-4864-a039-e30cb2cd88d5" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: E0613 05:05:18.839400 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="dnsmasq-dns" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839407 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="dnsmasq-dns" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839583 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d173fc-d458-4f68-b2ea-b6b2ed942c5d" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839595 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb658e7d-f920-4362-9ae8-149aaca08cda" containerName="keystone-db-sync" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839604 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab02aed-8c68-42ce-8b7b-31ddc678cdc4" containerName="dnsmasq-dns" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.839614 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff06e15-dbe8-4864-a039-e30cb2cd88d5" containerName="mariadb-account-create" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.840380 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.855821 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.887526 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-84mx2"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.889008 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.892669 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.893004 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.893161 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.893430 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2nhqr" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.911990 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-84mx2"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.957316 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8jw\" (UniqueName: \"kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.957374 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.957400 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.957617 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:18.957838 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.059250 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.059287 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.059318 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.059876 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060037 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060155 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8jw\" (UniqueName: \"kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060202 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060238 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060279 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxhf\" (UniqueName: \"kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060327 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060364 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060519 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.060993 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.061089 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.061203 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.100612 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8jw\" (UniqueName: \"kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw\") pod \"dnsmasq-dns-76f7f4785c-r26pk\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.160305 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161228 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161301 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxhf\" (UniqueName: \"kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161330 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161378 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161393 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.161417 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.164417 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.165392 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.166999 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.167870 4894 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.174255 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.181013 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxhf\" (UniqueName: \"kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf\") pod \"keystone-bootstrap-84mx2\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.213124 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.350736 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-czlhd"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.355264 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.360186 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bwdvs" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.360359 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.360550 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.366412 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-czlhd"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434497 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434841 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbkw\" (UniqueName: \"kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434875 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434892 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: 
\"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434975 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.434999 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.435069 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.442805 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cgk85"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.445137 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.453715 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cgk85"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.464887 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8d74c" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.465052 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.465550 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.478804 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.505932 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.507314 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536353 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536399 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536431 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536480 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536502 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536517 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536557 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnj2x\" (UniqueName: \"kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536597 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536628 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 
05:05:19.536647 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536686 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.536725 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbkw\" (UniqueName: \"kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.540254 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.544055 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.549775 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.553464 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.554295 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gq5rk"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.563158 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.564073 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.564533 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.567807 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.568348 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbkw\" (UniqueName: \"kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw\") pod \"cinder-db-sync-czlhd\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.569388 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.569489 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rfw5b" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.581305 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gq5rk"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638437 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtntp\" (UniqueName: \"kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638496 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638525 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638544 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638565 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638585 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle\") pod 
\"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638609 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5rk\" (UniqueName: \"kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638628 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638665 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638701 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638734 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638812 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.638919 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnj2x\" (UniqueName: \"kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.639043 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.655574 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " 
pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.662513 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.663338 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.665765 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnj2x\" (UniqueName: \"kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x\") pod \"placement-db-sync-cgk85\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.677791 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7xc7g"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.678947 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.680792 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.680924 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.682669 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q7jk8" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.687649 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-czlhd" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.703764 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7xc7g"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744007 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744111 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtntp\" (UniqueName: \"kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744155 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744176 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744197 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744239 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5rk\" (UniqueName: \"kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744258 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.744295 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.746309 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.748707 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.750586 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.750676 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.759732 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.762647 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.764715 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5rk\" (UniqueName: \"kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk\") pod \"barbican-db-sync-gq5rk\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.765358 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtntp\" (UniqueName: \"kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp\") pod \"dnsmasq-dns-5d95cb594c-jd78r\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.812670 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.835474 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.847216 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2zx\" (UniqueName: \"kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.847270 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.847311 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.893417 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.949310 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.951166 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2zx\" (UniqueName: \"kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.951220 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.955028 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.955379 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.958868 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.964932 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-84mx2"] Jun 13 05:05:19 crc kubenswrapper[4894]: I0613 05:05:19.983449 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2zx\" (UniqueName: \"kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx\") pod \"neutron-db-sync-7xc7g\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.082856 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.209539 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-czlhd"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.295083 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.297130 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.299503 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.299893 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.300247 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:05:20 crc kubenswrapper[4894]: W0613 05:05:20.417973 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c26ac92_294c_4929_a955_c6184b543538.slice/crio-c0bd2f1c0dab86cbc5fe4e7b5e1fe41297c80ee729061472b3dd1552600be034 WatchSource:0}: Error finding container c0bd2f1c0dab86cbc5fe4e7b5e1fe41297c80ee729061472b3dd1552600be034: Status 404 returned error can't find the container with id c0bd2f1c0dab86cbc5fe4e7b5e1fe41297c80ee729061472b3dd1552600be034 Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.431223 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.475963 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476005 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476048 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476089 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fpsv\" 
(UniqueName: \"kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476113 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476141 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.476182 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.500026 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gq5rk"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581174 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fpsv\" (UniqueName: \"kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581218 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581244 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581287 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581320 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581339 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " 
pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.581371 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.582366 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.582790 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.593541 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.594669 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.596955 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.614172 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cgk85"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.619117 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.622301 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fpsv\" (UniqueName: \"kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv\") pod \"ceilometer-0\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.632066 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.703162 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7xc7g"] Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.736038 4894 generic.go:334] "Generic (PLEG): container finished" podID="c5889b61-b46a-4efa-8d39-1700292a051f" containerID="4a59898137313043589af15c0c64f22e126d40804c562a9d1852c9697bed8d4b" exitCode=0 Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.736300 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" event={"ID":"c5889b61-b46a-4efa-8d39-1700292a051f","Type":"ContainerDied","Data":"4a59898137313043589af15c0c64f22e126d40804c562a9d1852c9697bed8d4b"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.736326 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" event={"ID":"c5889b61-b46a-4efa-8d39-1700292a051f","Type":"ContainerStarted","Data":"6fed0dcf7fee4ccefd500e2ef431dccec249f561c6d41ba9a06b97a68d862d13"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.764886 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-84mx2" event={"ID":"eb4f6c13-8859-4205-a5c3-175955975050","Type":"ContainerStarted","Data":"e69437542148f0d36daf735f854c50a5c721f0064f55e0a73fd7af808aa6c24d"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.764946 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-84mx2" event={"ID":"eb4f6c13-8859-4205-a5c3-175955975050","Type":"ContainerStarted","Data":"f57c405a1f841dfce24b60a37f45cf8918a38e64c5ed1e9ae7262e07f6d91ed3"} Jun 13 05:05:20 crc kubenswrapper[4894]: W0613 05:05:20.766212 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab9044c_7402_497f_9496_c6ed4aaaa76c.slice/crio-0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db WatchSource:0}: Error finding container 0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db: Status 404 returned error can't find the container with id 0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.795676 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gq5rk" event={"ID":"fcfc55e9-b62b-4d38-8e72-4cf04ba09524","Type":"ContainerStarted","Data":"0ca194a4cee2a86a4538224c773c32065828d8667d949978eb1356233ef1f9d2"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.803473 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-84mx2" podStartSLOduration=2.803457979 podStartE2EDuration="2.803457979s" podCreationTimestamp="2025-06-13 05:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:20.796021249 +0000 UTC m=+879.242268722" watchObservedRunningTime="2025-06-13 05:05:20.803457979 +0000 UTC m=+879.249705442" Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.810771 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-czlhd" event={"ID":"e8402629-c5ed-4482-9a1c-bdf5caaa2a21","Type":"ContainerStarted","Data":"e29e9087928322668705115ae8633e2ef60e12262342c2cf0793a08eef6417c5"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.814047 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" event={"ID":"7c26ac92-294c-4929-a955-c6184b543538","Type":"ContainerStarted","Data":"c0bd2f1c0dab86cbc5fe4e7b5e1fe41297c80ee729061472b3dd1552600be034"} Jun 13 05:05:20 crc kubenswrapper[4894]: I0613 05:05:20.820159 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgk85" event={"ID":"6bcc604a-93b6-4aca-bbca-0b078378889d","Type":"ContainerStarted","Data":"b961826eb3b6d2699ec12179eb294cf2464201a10814fc7e9fa678d8677802f1"} Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.152173 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:05:21 crc kubenswrapper[4894]: W0613 05:05:21.167804 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b49d780_448b_4d39_be8c_4d711a72c12f.slice/crio-9c65826b6ad04e652bd9aeb884e7a0dd769ad4c2aa8f6783e29f0ff0c0be6cc7 WatchSource:0}: Error finding container 9c65826b6ad04e652bd9aeb884e7a0dd769ad4c2aa8f6783e29f0ff0c0be6cc7: Status 404 returned error can't find the container with id 9c65826b6ad04e652bd9aeb884e7a0dd769ad4c2aa8f6783e29f0ff0c0be6cc7 Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.822182 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.855967 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7xc7g" event={"ID":"7ab9044c-7402-497f-9496-c6ed4aaaa76c","Type":"ContainerStarted","Data":"e1eff5b7391ce617f84516910d42e0cb988c0199d1d6233bdd70f7c622e98682"} Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.856006 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7xc7g" event={"ID":"7ab9044c-7402-497f-9496-c6ed4aaaa76c","Type":"ContainerStarted","Data":"0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db"} Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.864868 4894 generic.go:334] "Generic (PLEG): container finished" podID="7c26ac92-294c-4929-a955-c6184b543538" containerID="036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5" exitCode=0 Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.864926 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" event={"ID":"7c26ac92-294c-4929-a955-c6184b543538","Type":"ContainerDied","Data":"036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5"} Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.873357 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerStarted","Data":"9c65826b6ad04e652bd9aeb884e7a0dd769ad4c2aa8f6783e29f0ff0c0be6cc7"} Jun 13 05:05:21 crc kubenswrapper[4894]: I0613 05:05:21.883303 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7xc7g" podStartSLOduration=2.883289468 podStartE2EDuration="2.883289468s" podCreationTimestamp="2025-06-13 05:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:21.877137605 +0000 UTC m=+880.323385068" watchObservedRunningTime="2025-06-13 05:05:21.883289468 +0000 UTC m=+880.329536931" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.249041 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.333245 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc\") pod \"c5889b61-b46a-4efa-8d39-1700292a051f\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.333287 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config\") pod \"c5889b61-b46a-4efa-8d39-1700292a051f\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.333325 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v8jw\" (UniqueName: \"kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw\") pod \"c5889b61-b46a-4efa-8d39-1700292a051f\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.333373 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb\") pod \"c5889b61-b46a-4efa-8d39-1700292a051f\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.333533 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb\") pod \"c5889b61-b46a-4efa-8d39-1700292a051f\" (UID: \"c5889b61-b46a-4efa-8d39-1700292a051f\") " Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.342116 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw" (OuterVolumeSpecName: "kube-api-access-7v8jw") pod "c5889b61-b46a-4efa-8d39-1700292a051f" (UID: "c5889b61-b46a-4efa-8d39-1700292a051f"). InnerVolumeSpecName "kube-api-access-7v8jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.369388 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5889b61-b46a-4efa-8d39-1700292a051f" (UID: "c5889b61-b46a-4efa-8d39-1700292a051f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.380548 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5889b61-b46a-4efa-8d39-1700292a051f" (UID: "c5889b61-b46a-4efa-8d39-1700292a051f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.395214 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5889b61-b46a-4efa-8d39-1700292a051f" (UID: "c5889b61-b46a-4efa-8d39-1700292a051f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.407698 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config" (OuterVolumeSpecName: "config") pod "c5889b61-b46a-4efa-8d39-1700292a051f" (UID: "c5889b61-b46a-4efa-8d39-1700292a051f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.436147 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.436179 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.436188 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.436202 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v8jw\" (UniqueName: \"kubernetes.io/projected/c5889b61-b46a-4efa-8d39-1700292a051f-kube-api-access-7v8jw\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.436212 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5889b61-b46a-4efa-8d39-1700292a051f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.888460 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" event={"ID":"7c26ac92-294c-4929-a955-c6184b543538","Type":"ContainerStarted","Data":"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268"} Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.888616 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.910635 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.911212 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f7f4785c-r26pk" event={"ID":"c5889b61-b46a-4efa-8d39-1700292a051f","Type":"ContainerDied","Data":"6fed0dcf7fee4ccefd500e2ef431dccec249f561c6d41ba9a06b97a68d862d13"} Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.911280 4894 scope.go:117] "RemoveContainer" containerID="4a59898137313043589af15c0c64f22e126d40804c562a9d1852c9697bed8d4b" Jun 13 05:05:22 crc kubenswrapper[4894]: I0613 05:05:22.916535 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" podStartSLOduration=3.916519625 podStartE2EDuration="3.916519625s" podCreationTimestamp="2025-06-13 05:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:22.909306382 +0000 UTC m=+881.355553845" watchObservedRunningTime="2025-06-13 05:05:22.916519625 +0000 UTC m=+881.362767088" Jun 13 05:05:23 crc kubenswrapper[4894]: I0613 05:05:23.030075 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:23 crc kubenswrapper[4894]: I0613 05:05:23.037640 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f7f4785c-r26pk"] Jun 13 05:05:24 crc kubenswrapper[4894]: I0613 05:05:24.289512 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5889b61-b46a-4efa-8d39-1700292a051f" path="/var/lib/kubelet/pods/c5889b61-b46a-4efa-8d39-1700292a051f/volumes" Jun 13 05:05:24 crc kubenswrapper[4894]: I0613 05:05:24.938326 4894 generic.go:334] "Generic (PLEG): container finished" podID="eb4f6c13-8859-4205-a5c3-175955975050" containerID="e69437542148f0d36daf735f854c50a5c721f0064f55e0a73fd7af808aa6c24d" exitCode=0 Jun 13 05:05:24 crc kubenswrapper[4894]: I0613 05:05:24.938535 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-84mx2" event={"ID":"eb4f6c13-8859-4205-a5c3-175955975050","Type":"ContainerDied","Data":"e69437542148f0d36daf735f854c50a5c721f0064f55e0a73fd7af808aa6c24d"} Jun 13 05:05:26 crc kubenswrapper[4894]: I0613 05:05:26.236441 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:05:26 crc kubenswrapper[4894]: I0613 05:05:26.236753 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.156728 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242243 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxhf\" (UniqueName: \"kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242293 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242373 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242393 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242460 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.242525 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys\") pod \"eb4f6c13-8859-4205-a5c3-175955975050\" (UID: \"eb4f6c13-8859-4205-a5c3-175955975050\") " Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.248799 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf" (OuterVolumeSpecName: "kube-api-access-5lxhf") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "kube-api-access-5lxhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.250028 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts" (OuterVolumeSpecName: "scripts") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.250108 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.250542 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.278417 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.278971 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data" (OuterVolumeSpecName: "config-data") pod "eb4f6c13-8859-4205-a5c3-175955975050" (UID: "eb4f6c13-8859-4205-a5c3-175955975050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344087 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344116 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344128 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344142 4894 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-credential-keys\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344154 4894 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb4f6c13-8859-4205-a5c3-175955975050-fernet-keys\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.344165 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxhf\" (UniqueName: \"kubernetes.io/projected/eb4f6c13-8859-4205-a5c3-175955975050-kube-api-access-5lxhf\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.964400 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-84mx2" event={"ID":"eb4f6c13-8859-4205-a5c3-175955975050","Type":"ContainerDied","Data":"f57c405a1f841dfce24b60a37f45cf8918a38e64c5ed1e9ae7262e07f6d91ed3"} Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.964444 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57c405a1f841dfce24b60a37f45cf8918a38e64c5ed1e9ae7262e07f6d91ed3" Jun 13 05:05:27 crc kubenswrapper[4894]: I0613 05:05:27.964505 4894 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-84mx2" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.238922 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-84mx2"] Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.243683 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-84mx2"] Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.293541 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4f6c13-8859-4205-a5c3-175955975050" path="/var/lib/kubelet/pods/eb4f6c13-8859-4205-a5c3-175955975050/volumes" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.335681 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qhq8x"] Jun 13 05:05:28 crc kubenswrapper[4894]: E0613 05:05:28.335985 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f6c13-8859-4205-a5c3-175955975050" containerName="keystone-bootstrap" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.336002 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f6c13-8859-4205-a5c3-175955975050" containerName="keystone-bootstrap" Jun 13 05:05:28 crc kubenswrapper[4894]: E0613 05:05:28.336017 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5889b61-b46a-4efa-8d39-1700292a051f" containerName="init" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.336023 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5889b61-b46a-4efa-8d39-1700292a051f" containerName="init" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.336293 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5889b61-b46a-4efa-8d39-1700292a051f" containerName="init" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.336320 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f6c13-8859-4205-a5c3-175955975050" containerName="keystone-bootstrap" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.338509 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.342830 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qhq8x"] Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.343181 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.343267 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2nhqr" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.343388 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.343470 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.464963 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.465004 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.465043 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzbpg\" (UniqueName: \"kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.465078 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.465623 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.465710 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569126 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts\") pod \"keystone-bootstrap-qhq8x\" (UID: 
\"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569482 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569518 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569536 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569570 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzbpg\" (UniqueName: \"kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.569600 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.577493 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.581095 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.581405 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.581734 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.587746 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.590171 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzbpg\" (UniqueName: \"kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg\") pod \"keystone-bootstrap-qhq8x\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:28 crc kubenswrapper[4894]: I0613 05:05:28.664745 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:29 crc kubenswrapper[4894]: I0613 05:05:29.836595 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:29 crc kubenswrapper[4894]: I0613 05:05:29.903171 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:29 crc kubenswrapper[4894]: I0613 05:05:29.903406 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5886589657-tltrd" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" containerID="cri-o://33b89d2ba91baca67461085b4659ff7270dfdff8b65de22b436d8a5ac127cafb" gracePeriod=10 Jun 13 05:05:30 crc kubenswrapper[4894]: I0613 05:05:30.989162 4894 generic.go:334] "Generic (PLEG): container finished" podID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerID="33b89d2ba91baca67461085b4659ff7270dfdff8b65de22b436d8a5ac127cafb" exitCode=0 Jun 13 05:05:30 crc kubenswrapper[4894]: I0613 05:05:30.989356 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5886589657-tltrd" event={"ID":"d5add461-1f3d-4ae7-b0a7-967d111be2ba","Type":"ContainerDied","Data":"33b89d2ba91baca67461085b4659ff7270dfdff8b65de22b436d8a5ac127cafb"} Jun 13 05:05:31 crc kubenswrapper[4894]: I0613 05:05:31.169794 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5886589657-tltrd" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Jun 13 05:05:38 crc kubenswrapper[4894]: I0613 05:05:38.044684 4894 generic.go:334] "Generic (PLEG): container finished" podID="7ab9044c-7402-497f-9496-c6ed4aaaa76c" containerID="e1eff5b7391ce617f84516910d42e0cb988c0199d1d6233bdd70f7c622e98682" exitCode=0 Jun 13 05:05:38 crc kubenswrapper[4894]: I0613 05:05:38.044760 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7xc7g" event={"ID":"7ab9044c-7402-497f-9496-c6ed4aaaa76c","Type":"ContainerDied","Data":"e1eff5b7391ce617f84516910d42e0cb988c0199d1d6233bdd70f7c622e98682"} Jun 13 05:05:39 crc kubenswrapper[4894]: E0613 05:05:39.308531 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jun 13 05:05:39 crc kubenswrapper[4894]: E0613 05:05:39.308952 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7q5rk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gq5rk_openstack(fcfc55e9-b62b-4d38-8e72-4cf04ba09524): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:05:39 crc kubenswrapper[4894]: E0613 05:05:39.310471 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gq5rk" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.371152 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.498325 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jghg7\" (UniqueName: \"kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7\") pod \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.498934 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb\") pod \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.498975 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb\") pod \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.499131 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config\") pod \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.499187 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc\") pod \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\" (UID: \"d5add461-1f3d-4ae7-b0a7-967d111be2ba\") " Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.509798 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7" (OuterVolumeSpecName: "kube-api-access-jghg7") pod "d5add461-1f3d-4ae7-b0a7-967d111be2ba" (UID: "d5add461-1f3d-4ae7-b0a7-967d111be2ba"). InnerVolumeSpecName "kube-api-access-jghg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.544791 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config" (OuterVolumeSpecName: "config") pod "d5add461-1f3d-4ae7-b0a7-967d111be2ba" (UID: "d5add461-1f3d-4ae7-b0a7-967d111be2ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.550597 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5add461-1f3d-4ae7-b0a7-967d111be2ba" (UID: "d5add461-1f3d-4ae7-b0a7-967d111be2ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.553853 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5add461-1f3d-4ae7-b0a7-967d111be2ba" (UID: "d5add461-1f3d-4ae7-b0a7-967d111be2ba"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.575488 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5add461-1f3d-4ae7-b0a7-967d111be2ba" (UID: "d5add461-1f3d-4ae7-b0a7-967d111be2ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.601055 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.601085 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.601098 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jghg7\" (UniqueName: \"kubernetes.io/projected/d5add461-1f3d-4ae7-b0a7-967d111be2ba-kube-api-access-jghg7\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.601111 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:39 crc kubenswrapper[4894]: I0613 05:05:39.601121 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5add461-1f3d-4ae7-b0a7-967d111be2ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.067696 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5886589657-tltrd" event={"ID":"d5add461-1f3d-4ae7-b0a7-967d111be2ba","Type":"ContainerDied","Data":"e5054d0ab955727731cf785ec47a95cc181c217222c6989cd39f2d62123c287c"} Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.067765 4894 scope.go:117] "RemoveContainer" containerID="33b89d2ba91baca67461085b4659ff7270dfdff8b65de22b436d8a5ac127cafb" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.067712 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5886589657-tltrd" Jun 13 05:05:40 crc kubenswrapper[4894]: E0613 05:05:40.079206 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gq5rk" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.118472 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.124471 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5886589657-tltrd"] Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.284921 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" path="/var/lib/kubelet/pods/d5add461-1f3d-4ae7-b0a7-967d111be2ba/volumes" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.566136 4894 scope.go:117] "RemoveContainer" containerID="88223d0b3c71ff476b6c4fbd13584d95e406fe76220d89cbe60c9eb8122c65b1" Jun 13 05:05:40 crc kubenswrapper[4894]: E0613 05:05:40.613361 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jun 13 05:05:40 crc kubenswrapper[4894]: E0613 05:05:40.613551 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-localtime,ReadOnly:true,MountPath:/etc/localtime,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkbkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-czlhd_openstack(e8402629-c5ed-4482-9a1c-bdf5caaa2a21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:05:40 crc kubenswrapper[4894]: E0613 05:05:40.615949 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-czlhd" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.803476 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.922153 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config\") pod \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.922390 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js2zx\" (UniqueName: \"kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx\") pod \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.922504 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle\") pod \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\" (UID: \"7ab9044c-7402-497f-9496-c6ed4aaaa76c\") " Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.926980 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx" (OuterVolumeSpecName: "kube-api-access-js2zx") pod "7ab9044c-7402-497f-9496-c6ed4aaaa76c" (UID: "7ab9044c-7402-497f-9496-c6ed4aaaa76c"). InnerVolumeSpecName "kube-api-access-js2zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.944151 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ab9044c-7402-497f-9496-c6ed4aaaa76c" (UID: "7ab9044c-7402-497f-9496-c6ed4aaaa76c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.950932 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config" (OuterVolumeSpecName: "config") pod "7ab9044c-7402-497f-9496-c6ed4aaaa76c" (UID: "7ab9044c-7402-497f-9496-c6ed4aaaa76c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:40 crc kubenswrapper[4894]: I0613 05:05:40.993065 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qhq8x"] Jun 13 05:05:40 crc kubenswrapper[4894]: W0613 05:05:40.998047 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a28689a_17c0_44ae_b07d_4b23fd1ce70a.slice/crio-902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad WatchSource:0}: Error finding container 902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad: Status 404 returned error can't find the container with id 902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.023837 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js2zx\" (UniqueName: \"kubernetes.io/projected/7ab9044c-7402-497f-9496-c6ed4aaaa76c-kube-api-access-js2zx\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.023959 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.024039 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ab9044c-7402-497f-9496-c6ed4aaaa76c-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.077036 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qhq8x" event={"ID":"1a28689a-17c0-44ae-b07d-4b23fd1ce70a","Type":"ContainerStarted","Data":"902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad"} Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.078680 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerStarted","Data":"57af36af8daa20b8e5ce925628216b15eeaf4095f1f2f01f2a4ff2aaa60b61ed"} Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.082787 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgk85" event={"ID":"6bcc604a-93b6-4aca-bbca-0b078378889d","Type":"ContainerStarted","Data":"4de4772a609b7651df82f3f443c5f6187453e7fc6f7e9559ee485579979ba94b"} Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.085232 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7xc7g" event={"ID":"7ab9044c-7402-497f-9496-c6ed4aaaa76c","Type":"ContainerDied","Data":"0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db"} Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.085254 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7xc7g" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.085272 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb60d31c3f5a3ab3d2c5cc3689252150a8905c88f3d4d8217793e0cb5a8c9db" Jun 13 05:05:41 crc kubenswrapper[4894]: E0613 05:05:41.088926 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-czlhd" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.102155 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cgk85" podStartSLOduration=2.293970992 podStartE2EDuration="22.102137897s" podCreationTimestamp="2025-06-13 05:05:19 +0000 UTC" firstStartedPulling="2025-06-13 05:05:20.734192808 +0000 UTC m=+879.180440271" lastFinishedPulling="2025-06-13 05:05:40.542359683 +0000 UTC m=+898.988607176" observedRunningTime="2025-06-13 05:05:41.097918208 +0000 UTC m=+899.544165671" watchObservedRunningTime="2025-06-13 05:05:41.102137897 +0000 UTC m=+899.548385360" Jun 13 05:05:41 crc kubenswrapper[4894]: I0613 05:05:41.169807 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5886589657-tltrd" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057092 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:05:42 crc kubenswrapper[4894]: E0613 05:05:42.057378 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="init" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057388 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="init" Jun 13 05:05:42 crc kubenswrapper[4894]: E0613 05:05:42.057395 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057410 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" Jun 13 05:05:42 crc kubenswrapper[4894]: E0613 05:05:42.057420 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab9044c-7402-497f-9496-c6ed4aaaa76c" containerName="neutron-db-sync" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057425 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab9044c-7402-497f-9496-c6ed4aaaa76c" containerName="neutron-db-sync" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057575 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5add461-1f3d-4ae7-b0a7-967d111be2ba" containerName="dnsmasq-dns" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.057589 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab9044c-7402-497f-9496-c6ed4aaaa76c" containerName="neutron-db-sync" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.060225 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.074862 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.111891 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qhq8x" event={"ID":"1a28689a-17c0-44ae-b07d-4b23fd1ce70a","Type":"ContainerStarted","Data":"7fcb47e81f9e7a6cdc8a98608a8682b05f8a9ef490eb7b9d185032288f09f93c"} Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.156611 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qhq8x" podStartSLOduration=14.156564549 podStartE2EDuration="14.156564549s" podCreationTimestamp="2025-06-13 05:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:42.138100969 +0000 UTC m=+900.584348432" watchObservedRunningTime="2025-06-13 05:05:42.156564549 +0000 UTC m=+900.602812012" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.206408 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.208554 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.226841 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.238723 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q7jk8" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.238817 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.240689 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.244631 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.293824 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.293893 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.293941 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.293974 4894 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.293998 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.294019 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.294062 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8jb\" (UniqueName: \"kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.294082 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.294105 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvhc\" (UniqueName: \"kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.294129 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.395571 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.395627 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 
05:05:42.395682 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.395705 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.395723 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.396507 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.396557 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397435 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397478 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8jb\" (UniqueName: \"kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397497 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397535 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvhc\" (UniqueName: \"kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397564 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.397609 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.399130 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.406164 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.406212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.407003 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.410587 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.417796 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvhc\" (UniqueName: \"kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc\") pod \"neutron-685f75758b-gbvvw\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.419171 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8jb\" (UniqueName: \"kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb\") pod \"dnsmasq-dns-5b59768489-skrdq\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.539738 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.680174 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:42 crc kubenswrapper[4894]: I0613 05:05:42.969788 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:05:42 crc kubenswrapper[4894]: W0613 05:05:42.978251 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2968fe1b_db73_45f4_974f_cab909b022f3.slice/crio-a20ce62aac6e55da936f1b1e775ae543b84d40556ec57a1f1aaef1be17f54aeb WatchSource:0}: Error finding container a20ce62aac6e55da936f1b1e775ae543b84d40556ec57a1f1aaef1be17f54aeb: Status 404 returned error can't find the container with id a20ce62aac6e55da936f1b1e775ae543b84d40556ec57a1f1aaef1be17f54aeb Jun 13 05:05:43 crc kubenswrapper[4894]: I0613 05:05:43.108316 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:05:43 crc kubenswrapper[4894]: W0613 05:05:43.116823 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b704f8_037e_4d44_a89d_d8a7ae539b13.slice/crio-560839a5b0de45eb04d6a09df10142d0d510a0c49a69d6c4fca6f7ce68c3e61e WatchSource:0}: Error finding container 560839a5b0de45eb04d6a09df10142d0d510a0c49a69d6c4fca6f7ce68c3e61e: Status 404 returned error can't find the container with id 560839a5b0de45eb04d6a09df10142d0d510a0c49a69d6c4fca6f7ce68c3e61e Jun 13 05:05:43 crc kubenswrapper[4894]: I0613 05:05:43.122521 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerStarted","Data":"9c433512ce23030dba6b3b11e29dd584db2a13ad00911ae1ecdffef1fb8c539c"} Jun 13 05:05:43 crc kubenswrapper[4894]: I0613 05:05:43.125177 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b59768489-skrdq" event={"ID":"2968fe1b-db73-45f4-974f-cab909b022f3","Type":"ContainerStarted","Data":"a20ce62aac6e55da936f1b1e775ae543b84d40556ec57a1f1aaef1be17f54aeb"} Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.133734 4894 generic.go:334] "Generic (PLEG): container finished" podID="6bcc604a-93b6-4aca-bbca-0b078378889d" containerID="4de4772a609b7651df82f3f443c5f6187453e7fc6f7e9559ee485579979ba94b" exitCode=0 Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.134948 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgk85" event={"ID":"6bcc604a-93b6-4aca-bbca-0b078378889d","Type":"ContainerDied","Data":"4de4772a609b7651df82f3f443c5f6187453e7fc6f7e9559ee485579979ba94b"} Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.137900 4894 generic.go:334] "Generic (PLEG): container finished" podID="2968fe1b-db73-45f4-974f-cab909b022f3" containerID="6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae" exitCode=0 Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.138003 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b59768489-skrdq" event={"ID":"2968fe1b-db73-45f4-974f-cab909b022f3","Type":"ContainerDied","Data":"6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae"} Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.142045 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerStarted","Data":"970793b09107d578520bb735bb37f72b83e1a346984ff9caaaf079a70eab3137"} Jun 13 05:05:44 crc 
kubenswrapper[4894]: I0613 05:05:44.142140 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerStarted","Data":"4de1d4603802db591a776d4762fd23358d4b55e965d40595af3aa363a9e3cc11"} Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.142196 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerStarted","Data":"560839a5b0de45eb04d6a09df10142d0d510a0c49a69d6c4fca6f7ce68c3e61e"} Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.841452 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-554d559d55-rtnwg"] Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.848494 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.851557 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.851779 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.855146 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-554d559d55-rtnwg"] Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940618 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-httpd-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940680 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-ovndb-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940721 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-public-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940737 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qdh2\" (UniqueName: \"kubernetes.io/projected/9401dca8-385e-4849-abb9-38059dd2ae63-kube-api-access-8qdh2\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940756 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940773 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-internal-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:44 crc kubenswrapper[4894]: I0613 05:05:44.940814 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-combined-ca-bundle\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.041938 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-httpd-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.041978 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-ovndb-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.042012 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-public-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.042028 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qdh2\" (UniqueName: \"kubernetes.io/projected/9401dca8-385e-4849-abb9-38059dd2ae63-kube-api-access-8qdh2\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.042044 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.042063 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-internal-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.042109 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-combined-ca-bundle\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.047404 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-ovndb-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.048209 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-combined-ca-bundle\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.048774 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-internal-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.062190 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-httpd-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.062433 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-public-tls-certs\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.062484 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qdh2\" (UniqueName: \"kubernetes.io/projected/9401dca8-385e-4849-abb9-38059dd2ae63-kube-api-access-8qdh2\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.063123 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9401dca8-385e-4849-abb9-38059dd2ae63-config\") pod \"neutron-554d559d55-rtnwg\" (UID: \"9401dca8-385e-4849-abb9-38059dd2ae63\") " pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.156156 4894 generic.go:334] "Generic (PLEG): container finished" podID="1a28689a-17c0-44ae-b07d-4b23fd1ce70a" containerID="7fcb47e81f9e7a6cdc8a98608a8682b05f8a9ef490eb7b9d185032288f09f93c" exitCode=0 Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.157022 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qhq8x" event={"ID":"1a28689a-17c0-44ae-b07d-4b23fd1ce70a","Type":"ContainerDied","Data":"7fcb47e81f9e7a6cdc8a98608a8682b05f8a9ef490eb7b9d185032288f09f93c"} Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.157252 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.171544 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.193954 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-685f75758b-gbvvw" podStartSLOduration=3.193934384 podStartE2EDuration="3.193934384s" podCreationTimestamp="2025-06-13 05:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:45.184962361 +0000 UTC m=+903.631209824" watchObservedRunningTime="2025-06-13 05:05:45.193934384 +0000 UTC m=+903.640181847" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.509865 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.549884 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs\") pod \"6bcc604a-93b6-4aca-bbca-0b078378889d\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.549945 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data\") pod \"6bcc604a-93b6-4aca-bbca-0b078378889d\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.549973 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts\") pod \"6bcc604a-93b6-4aca-bbca-0b078378889d\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.550040 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnj2x\" (UniqueName: \"kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x\") pod \"6bcc604a-93b6-4aca-bbca-0b078378889d\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.550065 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle\") pod \"6bcc604a-93b6-4aca-bbca-0b078378889d\" (UID: \"6bcc604a-93b6-4aca-bbca-0b078378889d\") " Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.558024 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts" (OuterVolumeSpecName: "scripts") pod "6bcc604a-93b6-4aca-bbca-0b078378889d" (UID: "6bcc604a-93b6-4aca-bbca-0b078378889d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.558185 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs" (OuterVolumeSpecName: "logs") pod "6bcc604a-93b6-4aca-bbca-0b078378889d" (UID: "6bcc604a-93b6-4aca-bbca-0b078378889d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.583611 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x" (OuterVolumeSpecName: "kube-api-access-dnj2x") pod "6bcc604a-93b6-4aca-bbca-0b078378889d" (UID: "6bcc604a-93b6-4aca-bbca-0b078378889d"). InnerVolumeSpecName "kube-api-access-dnj2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.600832 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data" (OuterVolumeSpecName: "config-data") pod "6bcc604a-93b6-4aca-bbca-0b078378889d" (UID: "6bcc604a-93b6-4aca-bbca-0b078378889d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.627588 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bcc604a-93b6-4aca-bbca-0b078378889d" (UID: "6bcc604a-93b6-4aca-bbca-0b078378889d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.657729 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnj2x\" (UniqueName: \"kubernetes.io/projected/6bcc604a-93b6-4aca-bbca-0b078378889d-kube-api-access-dnj2x\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.657759 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.657769 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bcc604a-93b6-4aca-bbca-0b078378889d-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.657780 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.657789 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bcc604a-93b6-4aca-bbca-0b078378889d-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:45 crc kubenswrapper[4894]: I0613 05:05:45.867288 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-554d559d55-rtnwg"] Jun 13 05:05:45 crc kubenswrapper[4894]: W0613 05:05:45.894062 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9401dca8_385e_4849_abb9_38059dd2ae63.slice/crio-c21b13985756194f1529c0bc33da46ffe011d6635ac682aca9d53bb89a4d28a1 WatchSource:0}: Error finding container c21b13985756194f1529c0bc33da46ffe011d6635ac682aca9d53bb89a4d28a1: Status 404 returned error can't find the container with id c21b13985756194f1529c0bc33da46ffe011d6635ac682aca9d53bb89a4d28a1 Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.164046 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cgk85" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.164064 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cgk85" event={"ID":"6bcc604a-93b6-4aca-bbca-0b078378889d","Type":"ContainerDied","Data":"b961826eb3b6d2699ec12179eb294cf2464201a10814fc7e9fa678d8677802f1"} Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.164447 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b961826eb3b6d2699ec12179eb294cf2464201a10814fc7e9fa678d8677802f1" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.167641 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-554d559d55-rtnwg" event={"ID":"9401dca8-385e-4849-abb9-38059dd2ae63","Type":"ContainerStarted","Data":"c21b13985756194f1529c0bc33da46ffe011d6635ac682aca9d53bb89a4d28a1"} Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.431497 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.576751 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.576854 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzbpg\" (UniqueName: \"kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.576911 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.576938 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.576990 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.577008 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle\") pod \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\" (UID: \"1a28689a-17c0-44ae-b07d-4b23fd1ce70a\") " Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.597948 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg" (OuterVolumeSpecName: "kube-api-access-dzbpg") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: 
"1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "kube-api-access-dzbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.606492 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: "1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.606599 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts" (OuterVolumeSpecName: "scripts") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: "1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.619849 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: "1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.669390 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: "1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.679405 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56495d6d8b-4hpz7"] Jun 13 05:05:46 crc kubenswrapper[4894]: E0613 05:05:46.680555 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a28689a-17c0-44ae-b07d-4b23fd1ce70a" containerName="keystone-bootstrap" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.680570 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a28689a-17c0-44ae-b07d-4b23fd1ce70a" containerName="keystone-bootstrap" Jun 13 05:05:46 crc kubenswrapper[4894]: E0613 05:05:46.680607 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcc604a-93b6-4aca-bbca-0b078378889d" containerName="placement-db-sync" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.680613 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcc604a-93b6-4aca-bbca-0b078378889d" containerName="placement-db-sync" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.680886 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a28689a-17c0-44ae-b07d-4b23fd1ce70a" containerName="keystone-bootstrap" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.680917 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bcc604a-93b6-4aca-bbca-0b078378889d" containerName="placement-db-sync" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682148 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682797 4894 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682842 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzbpg\" (UniqueName: \"kubernetes.io/projected/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-kube-api-access-dzbpg\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682854 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682863 4894 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-credential-keys\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.682871 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.688177 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.688468 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.692174 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8d74c" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.701971 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.702345 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.732023 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56495d6d8b-4hpz7"] Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.732304 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data" (OuterVolumeSpecName: "config-data") pod "1a28689a-17c0-44ae-b07d-4b23fd1ce70a" (UID: "1a28689a-17c0-44ae-b07d-4b23fd1ce70a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.785532 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-combined-ca-bundle\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786078 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-config-data\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786118 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-scripts\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786182 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-internal-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786319 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-public-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786348 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-528jd\" (UniqueName: \"kubernetes.io/projected/8e69886e-0c5c-4f2b-b479-fd600873129b-kube-api-access-528jd\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786378 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e69886e-0c5c-4f2b-b479-fd600873129b-logs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.786433 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28689a-17c0-44ae-b07d-4b23fd1ce70a-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.888048 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-internal-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: 
I0613 05:05:46.888412 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-public-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.888456 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-528jd\" (UniqueName: \"kubernetes.io/projected/8e69886e-0c5c-4f2b-b479-fd600873129b-kube-api-access-528jd\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.888483 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e69886e-0c5c-4f2b-b479-fd600873129b-logs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.889198 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-combined-ca-bundle\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.889279 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-config-data\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.889313 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-scripts\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.893980 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e69886e-0c5c-4f2b-b479-fd600873129b-logs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.895554 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-public-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.895590 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-internal-tls-certs\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.895680 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-config-data\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.900629 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-scripts\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.901407 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69886e-0c5c-4f2b-b479-fd600873129b-combined-ca-bundle\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:46 crc kubenswrapper[4894]: I0613 05:05:46.909759 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-528jd\" (UniqueName: \"kubernetes.io/projected/8e69886e-0c5c-4f2b-b479-fd600873129b-kube-api-access-528jd\") pod \"placement-56495d6d8b-4hpz7\" (UID: \"8e69886e-0c5c-4f2b-b479-fd600873129b\") " pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.087459 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.181223 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qhq8x" event={"ID":"1a28689a-17c0-44ae-b07d-4b23fd1ce70a","Type":"ContainerDied","Data":"902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad"} Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.181247 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qhq8x" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.181268 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="902dbc19c6047ad9d01e58e88668980ddf379299fe3ddaf256ee8b88637e89ad" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.183986 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b59768489-skrdq" event={"ID":"2968fe1b-db73-45f4-974f-cab909b022f3","Type":"ContainerStarted","Data":"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5"} Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.184114 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.186429 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-554d559d55-rtnwg" event={"ID":"9401dca8-385e-4849-abb9-38059dd2ae63","Type":"ContainerStarted","Data":"d056f6b10a788c73ba4466efd9ac4c7a75d980044be619f369213def294a5417"} Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.204409 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b59768489-skrdq" podStartSLOduration=5.204394171 podStartE2EDuration="5.204394171s" podCreationTimestamp="2025-06-13 05:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:47.200071089 +0000 UTC m=+905.646318552" watchObservedRunningTime="2025-06-13 05:05:47.204394171 +0000 UTC m=+905.650641634" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.348504 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6bfbcbb6c7-q4p46"] Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.351080 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.353023 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.353274 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.353449 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.353554 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.353686 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2nhqr" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.354017 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.367643 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bfbcbb6c7-q4p46"] Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500410 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-public-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500455 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-fernet-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500473 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-combined-ca-bundle\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500492 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-internal-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500520 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-scripts\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500546 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-config-data\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: 
\"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500567 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n88q\" (UniqueName: \"kubernetes.io/projected/0e483628-45ae-49b7-bb58-abfeda32d6c0-kube-api-access-4n88q\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.500607 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-credential-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601601 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-public-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601667 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-fernet-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601689 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-combined-ca-bundle\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601705 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-internal-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601729 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-scripts\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601755 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-config-data\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601776 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n88q\" (UniqueName: \"kubernetes.io/projected/0e483628-45ae-49b7-bb58-abfeda32d6c0-kube-api-access-4n88q\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " 
pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.601826 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-credential-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.607349 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-fernet-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.608534 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-credential-keys\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.608686 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-internal-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.615083 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-scripts\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.621484 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n88q\" (UniqueName: \"kubernetes.io/projected/0e483628-45ae-49b7-bb58-abfeda32d6c0-kube-api-access-4n88q\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.623889 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-combined-ca-bundle\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.624408 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-public-tls-certs\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.630277 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e483628-45ae-49b7-bb58-abfeda32d6c0-config-data\") pod \"keystone-6bfbcbb6c7-q4p46\" (UID: \"0e483628-45ae-49b7-bb58-abfeda32d6c0\") " pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:47 crc kubenswrapper[4894]: I0613 05:05:47.671009 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:50 crc kubenswrapper[4894]: I0613 05:05:50.912061 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56495d6d8b-4hpz7"] Jun 13 05:05:50 crc kubenswrapper[4894]: W0613 05:05:50.919486 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e69886e_0c5c_4f2b_b479_fd600873129b.slice/crio-6fb17385c41375889f877cb9cf92db5ff159494f1fcfaebb5032d34172f7d33e WatchSource:0}: Error finding container 6fb17385c41375889f877cb9cf92db5ff159494f1fcfaebb5032d34172f7d33e: Status 404 returned error can't find the container with id 6fb17385c41375889f877cb9cf92db5ff159494f1fcfaebb5032d34172f7d33e Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.051937 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bfbcbb6c7-q4p46"] Jun 13 05:05:51 crc kubenswrapper[4894]: W0613 05:05:51.059077 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e483628_45ae_49b7_bb58_abfeda32d6c0.slice/crio-8796e427081b2511df4deeecc464fbb96d57423ef9dd5b8706c9785afb504cac WatchSource:0}: Error finding container 8796e427081b2511df4deeecc464fbb96d57423ef9dd5b8706c9785afb504cac: Status 404 returned error can't find the container with id 8796e427081b2511df4deeecc464fbb96d57423ef9dd5b8706c9785afb504cac Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.223255 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerStarted","Data":"80f9eb4af478c6303cd74a58162fac7be113efa818725736ee0ee603013f0c77"} Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.224559 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bfbcbb6c7-q4p46" event={"ID":"0e483628-45ae-49b7-bb58-abfeda32d6c0","Type":"ContainerStarted","Data":"8796e427081b2511df4deeecc464fbb96d57423ef9dd5b8706c9785afb504cac"} Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.226153 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-554d559d55-rtnwg" event={"ID":"9401dca8-385e-4849-abb9-38059dd2ae63","Type":"ContainerStarted","Data":"101db9a9b23e385668e3d02613752746b50ab0dbcab543b6649e9df45556dc65"} Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.226769 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.229679 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56495d6d8b-4hpz7" event={"ID":"8e69886e-0c5c-4f2b-b479-fd600873129b","Type":"ContainerStarted","Data":"ad96a1c6faa798264a205e65ed4d342e4d0aa9dd9bc5577777c82927920916cc"} Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.229707 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56495d6d8b-4hpz7" event={"ID":"8e69886e-0c5c-4f2b-b479-fd600873129b","Type":"ContainerStarted","Data":"6fb17385c41375889f877cb9cf92db5ff159494f1fcfaebb5032d34172f7d33e"} Jun 13 05:05:51 crc kubenswrapper[4894]: I0613 05:05:51.248588 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-554d559d55-rtnwg" podStartSLOduration=7.248572539 podStartE2EDuration="7.248572539s" podCreationTimestamp="2025-06-13 05:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:51.243152367 +0000 UTC m=+909.689399830" watchObservedRunningTime="2025-06-13 05:05:51.248572539 +0000 UTC m=+909.694820002" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.257453 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56495d6d8b-4hpz7" event={"ID":"8e69886e-0c5c-4f2b-b479-fd600873129b","Type":"ContainerStarted","Data":"098417b6825322cb051e5b9ffc3a552f7492180ea4f788cff99c38f4cbe47c53"} Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.258676 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.258702 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.263683 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bfbcbb6c7-q4p46" event={"ID":"0e483628-45ae-49b7-bb58-abfeda32d6c0","Type":"ContainerStarted","Data":"2f5479c995947f2f4f0284c527d3d11f114e17f08d32114606d789d36a0a86a4"} Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.276169 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56495d6d8b-4hpz7" podStartSLOduration=6.276148906 podStartE2EDuration="6.276148906s" podCreationTimestamp="2025-06-13 05:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:52.273196023 +0000 UTC m=+910.719443486" watchObservedRunningTime="2025-06-13 05:05:52.276148906 +0000 UTC m=+910.722396379" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.299176 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6bfbcbb6c7-q4p46" podStartSLOduration=5.299153564 podStartE2EDuration="5.299153564s" podCreationTimestamp="2025-06-13 05:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:05:52.298985809 +0000 UTC m=+910.745233292" watchObservedRunningTime="2025-06-13 05:05:52.299153564 +0000 UTC m=+910.745401027" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.681444 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.736086 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:52 crc kubenswrapper[4894]: I0613 05:05:52.736277 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="dnsmasq-dns" containerID="cri-o://f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268" gracePeriod=10 Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.197579 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.217142 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config\") pod \"7c26ac92-294c-4929-a955-c6184b543538\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.217177 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb\") pod \"7c26ac92-294c-4929-a955-c6184b543538\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.217193 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb\") pod \"7c26ac92-294c-4929-a955-c6184b543538\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.217252 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtntp\" (UniqueName: \"kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp\") pod \"7c26ac92-294c-4929-a955-c6184b543538\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.217270 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc\") pod \"7c26ac92-294c-4929-a955-c6184b543538\" (UID: \"7c26ac92-294c-4929-a955-c6184b543538\") " Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.241412 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp" (OuterVolumeSpecName: "kube-api-access-gtntp") pod "7c26ac92-294c-4929-a955-c6184b543538" (UID: "7c26ac92-294c-4929-a955-c6184b543538"). InnerVolumeSpecName "kube-api-access-gtntp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.257775 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7c26ac92-294c-4929-a955-c6184b543538" (UID: "7c26ac92-294c-4929-a955-c6184b543538"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.261378 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c26ac92-294c-4929-a955-c6184b543538" (UID: "7c26ac92-294c-4929-a955-c6184b543538"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.270066 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config" (OuterVolumeSpecName: "config") pod "7c26ac92-294c-4929-a955-c6184b543538" (UID: "7c26ac92-294c-4929-a955-c6184b543538"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.280926 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gq5rk" event={"ID":"fcfc55e9-b62b-4d38-8e72-4cf04ba09524","Type":"ContainerStarted","Data":"2074362eadcf5583824db0302d6442b4eb4183c1ecdce908429422e70357582f"} Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.281756 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c26ac92-294c-4929-a955-c6184b543538" (UID: "7c26ac92-294c-4929-a955-c6184b543538"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.285550 4894 generic.go:334] "Generic (PLEG): container finished" podID="7c26ac92-294c-4929-a955-c6184b543538" containerID="f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268" exitCode=0 Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.286003 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.294051 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" event={"ID":"7c26ac92-294c-4929-a955-c6184b543538","Type":"ContainerDied","Data":"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268"} Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.294086 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.294096 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d95cb594c-jd78r" event={"ID":"7c26ac92-294c-4929-a955-c6184b543538","Type":"ContainerDied","Data":"c0bd2f1c0dab86cbc5fe4e7b5e1fe41297c80ee729061472b3dd1552600be034"} Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.294114 4894 scope.go:117] "RemoveContainer" containerID="f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.313059 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gq5rk" podStartSLOduration=2.076343443 podStartE2EDuration="34.313041266s" podCreationTimestamp="2025-06-13 05:05:19 +0000 UTC" firstStartedPulling="2025-06-13 05:05:20.529201985 +0000 UTC m=+878.975449448" lastFinishedPulling="2025-06-13 05:05:52.765899808 +0000 UTC m=+911.212147271" observedRunningTime="2025-06-13 05:05:53.299724231 +0000 UTC m=+911.745971694" watchObservedRunningTime="2025-06-13 05:05:53.313041266 +0000 UTC m=+911.759288729" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.317100 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.318776 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.318801 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:53 crc 
kubenswrapper[4894]: I0613 05:05:53.318810 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.318819 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtntp\" (UniqueName: \"kubernetes.io/projected/7c26ac92-294c-4929-a955-c6184b543538-kube-api-access-gtntp\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.318828 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c26ac92-294c-4929-a955-c6184b543538-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.323775 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d95cb594c-jd78r"] Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.337561 4894 scope.go:117] "RemoveContainer" containerID="036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.368718 4894 scope.go:117] "RemoveContainer" containerID="f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268" Jun 13 05:05:53 crc kubenswrapper[4894]: E0613 05:05:53.376074 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268\": container with ID starting with f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268 not found: ID does not exist" containerID="f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.376144 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268"} err="failed to get container status \"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268\": rpc error: code = NotFound desc = could not find container \"f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268\": container with ID starting with f60bc0248e2897ab852d3ff3bcf70c7ad4a333cfc788f50d33b0aacf43f85268 not found: ID does not exist" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.376187 4894 scope.go:117] "RemoveContainer" containerID="036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5" Jun 13 05:05:53 crc kubenswrapper[4894]: E0613 05:05:53.376704 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5\": container with ID starting with 036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5 not found: ID does not exist" containerID="036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5" Jun 13 05:05:53 crc kubenswrapper[4894]: I0613 05:05:53.376731 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5"} err="failed to get container status \"036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5\": rpc error: code = NotFound desc = could not find container \"036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5\": container with ID starting with 
036a1609ba985dea20aaefa3d0799fa446a9ca7aefdc7c76768e7d8f1d432fb5 not found: ID does not exist" Jun 13 05:05:54 crc kubenswrapper[4894]: I0613 05:05:54.286978 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c26ac92-294c-4929-a955-c6184b543538" path="/var/lib/kubelet/pods/7c26ac92-294c-4929-a955-c6184b543538/volumes" Jun 13 05:05:56 crc kubenswrapper[4894]: I0613 05:05:56.236064 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:05:56 crc kubenswrapper[4894]: I0613 05:05:56.236332 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:05:56 crc kubenswrapper[4894]: I0613 05:05:56.314558 4894 generic.go:334] "Generic (PLEG): container finished" podID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" containerID="2074362eadcf5583824db0302d6442b4eb4183c1ecdce908429422e70357582f" exitCode=0 Jun 13 05:05:56 crc kubenswrapper[4894]: I0613 05:05:56.314683 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gq5rk" event={"ID":"fcfc55e9-b62b-4d38-8e72-4cf04ba09524","Type":"ContainerDied","Data":"2074362eadcf5583824db0302d6442b4eb4183c1ecdce908429422e70357582f"} Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.804460 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.924431 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data\") pod \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.924536 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle\") pod \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.924562 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q5rk\" (UniqueName: \"kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk\") pod \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\" (UID: \"fcfc55e9-b62b-4d38-8e72-4cf04ba09524\") " Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.959106 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk" (OuterVolumeSpecName: "kube-api-access-7q5rk") pod "fcfc55e9-b62b-4d38-8e72-4cf04ba09524" (UID: "fcfc55e9-b62b-4d38-8e72-4cf04ba09524"). InnerVolumeSpecName "kube-api-access-7q5rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.959923 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fcfc55e9-b62b-4d38-8e72-4cf04ba09524" (UID: "fcfc55e9-b62b-4d38-8e72-4cf04ba09524"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:58 crc kubenswrapper[4894]: I0613 05:05:58.983286 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcfc55e9-b62b-4d38-8e72-4cf04ba09524" (UID: "fcfc55e9-b62b-4d38-8e72-4cf04ba09524"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.029630 4894 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.029749 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.029808 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q5rk\" (UniqueName: \"kubernetes.io/projected/fcfc55e9-b62b-4d38-8e72-4cf04ba09524-kube-api-access-7q5rk\") on node \"crc\" DevicePath \"\"" Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.345926 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gq5rk" event={"ID":"fcfc55e9-b62b-4d38-8e72-4cf04ba09524","Type":"ContainerDied","Data":"0ca194a4cee2a86a4538224c773c32065828d8667d949978eb1356233ef1f9d2"} Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.345959 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca194a4cee2a86a4538224c773c32065828d8667d949978eb1356233ef1f9d2" Jun 13 05:05:59 crc kubenswrapper[4894]: I0613 05:05:59.346022 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gq5rk" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125174 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6688589669-t4pqd"] Jun 13 05:06:00 crc kubenswrapper[4894]: E0613 05:06:00.125747 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="init" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125758 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="init" Jun 13 05:06:00 crc kubenswrapper[4894]: E0613 05:06:00.125771 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" containerName="barbican-db-sync" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125777 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" containerName="barbican-db-sync" Jun 13 05:06:00 crc kubenswrapper[4894]: E0613 05:06:00.125794 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="dnsmasq-dns" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125803 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="dnsmasq-dns" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125960 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c26ac92-294c-4929-a955-c6184b543538" containerName="dnsmasq-dns" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.125977 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" containerName="barbican-db-sync" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.126753 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.129232 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.129725 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.148446 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rfw5b" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.155238 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.155490 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-combined-ca-bundle\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.155599 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baefe546-6c9e-41e3-a02c-2a5123bea0aa-logs\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.155705 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47hx5\" (UniqueName: \"kubernetes.io/projected/baefe546-6c9e-41e3-a02c-2a5123bea0aa-kube-api-access-47hx5\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.155787 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data-custom\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.170234 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6688589669-t4pqd"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.220734 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-688dbd77d4-tjxfc"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.222072 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.225495 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.233874 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-688dbd77d4-tjxfc"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258157 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258204 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfsp5\" (UniqueName: \"kubernetes.io/projected/26738de9-74f2-430a-a82d-ff0d7ec8e28f-kube-api-access-lfsp5\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258248 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-combined-ca-bundle\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258304 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26738de9-74f2-430a-a82d-ff0d7ec8e28f-logs\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258324 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baefe546-6c9e-41e3-a02c-2a5123bea0aa-logs\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258354 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47hx5\" (UniqueName: \"kubernetes.io/projected/baefe546-6c9e-41e3-a02c-2a5123bea0aa-kube-api-access-47hx5\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258506 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data-custom\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258530 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-combined-ca-bundle\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258560 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data-custom\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.258585 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.266083 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baefe546-6c9e-41e3-a02c-2a5123bea0aa-logs\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.277880 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.293189 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-combined-ca-bundle\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.293278 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baefe546-6c9e-41e3-a02c-2a5123bea0aa-config-data-custom\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.320135 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47hx5\" (UniqueName: \"kubernetes.io/projected/baefe546-6c9e-41e3-a02c-2a5123bea0aa-kube-api-access-47hx5\") pod \"barbican-worker-6688589669-t4pqd\" (UID: \"baefe546-6c9e-41e3-a02c-2a5123bea0aa\") " pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.350575 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.351896 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.351974 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361431 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26738de9-74f2-430a-a82d-ff0d7ec8e28f-logs\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361509 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-combined-ca-bundle\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361529 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data-custom\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361574 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361591 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfsp5\" (UniqueName: \"kubernetes.io/projected/26738de9-74f2-430a-a82d-ff0d7ec8e28f-kube-api-access-lfsp5\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.361913 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.363201 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.366600 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data-custom\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.366899 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.366922 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-combined-ca-bundle\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.371110 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26738de9-74f2-430a-a82d-ff0d7ec8e28f-config-data\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.371912 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26738de9-74f2-430a-a82d-ff0d7ec8e28f-logs\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.389531 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfsp5\" (UniqueName: \"kubernetes.io/projected/26738de9-74f2-430a-a82d-ff0d7ec8e28f-kube-api-access-lfsp5\") pod \"barbican-keystone-listener-688dbd77d4-tjxfc\" (UID: \"26738de9-74f2-430a-a82d-ff0d7ec8e28f\") " pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.406550 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.452700 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6688589669-t4pqd" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464642 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464695 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464769 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464786 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464800 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464828 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmjq\" (UniqueName: \"kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464856 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mx6f\" (UniqueName: \"kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464907 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.464969 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.465022 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.566699 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567166 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567230 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567250 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567296 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567315 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567328 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567346 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmjq\" (UniqueName: 
\"kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567370 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mx6f\" (UniqueName: \"kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.567427 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.568396 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.568842 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.571291 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.571449 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.572312 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.572373 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.572430 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.572775 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.574687 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.584445 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmjq\" (UniqueName: \"kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq\") pod \"barbican-api-84b4c54c78-v656z\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.591199 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mx6f\" (UniqueName: \"kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f\") pod \"dnsmasq-dns-6dbb4b98fc-7zr69\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.700313 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.722407 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:00 crc kubenswrapper[4894]: I0613 05:06:00.937920 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6688589669-t4pqd"] Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.039330 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-688dbd77d4-tjxfc"] Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.354024 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.360865 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:01 crc kubenswrapper[4894]: W0613 05:06:01.361201 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ca8d069_f049_48a0_baa9_12828cb5a77e.slice/crio-7f297c2daab27f445d2058b2c2c0d72782ce67c486320380f981c68e29136030 WatchSource:0}: Error finding container 7f297c2daab27f445d2058b2c2c0d72782ce67c486320380f981c68e29136030: Status 404 returned error can't find the container with id 7f297c2daab27f445d2058b2c2c0d72782ce67c486320380f981c68e29136030 Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.431035 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerStarted","Data":"989d0f7cbe4caadc36cd6b78c9b59d5052af19b80d54a17ddad6aec565ea2334"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.433238 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-czlhd" event={"ID":"e8402629-c5ed-4482-9a1c-bdf5caaa2a21","Type":"ContainerStarted","Data":"e9a6e357fe34bfe6ad66fd02bbb1d82712e7ecc6779201673c409dac2488e220"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.436525 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" event={"ID":"26738de9-74f2-430a-a82d-ff0d7ec8e28f","Type":"ContainerStarted","Data":"762addad1f86b130256f73d2d1e35d844efb8156af6450e54c56284cdf3e1c96"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438085 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerStarted","Data":"1e2aee83dd7d837efd4f3c5fe8255106941e4d815a71c99a66bf00e1740b3f6e"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438333 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-central-agent" containerID="cri-o://57af36af8daa20b8e5ce925628216b15eeaf4095f1f2f01f2a4ff2aaa60b61ed" gracePeriod=30 Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438478 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438469 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="sg-core" containerID="cri-o://80f9eb4af478c6303cd74a58162fac7be113efa818725736ee0ee603013f0c77" gracePeriod=30 Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438509 4894 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="proxy-httpd" containerID="cri-o://1e2aee83dd7d837efd4f3c5fe8255106941e4d815a71c99a66bf00e1740b3f6e" gracePeriod=30 Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.438560 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-notification-agent" containerID="cri-o://9c433512ce23030dba6b3b11e29dd584db2a13ad00911ae1ecdffef1fb8c539c" gracePeriod=30 Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.441753 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6688589669-t4pqd" event={"ID":"baefe546-6c9e-41e3-a02c-2a5123bea0aa","Type":"ContainerStarted","Data":"88337466f2f59e79f7f1572286d8f3e53a09e94412ae2270eb3cfd0045b58d94"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.445875 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" event={"ID":"4ca8d069-f049-48a0-baa9-12828cb5a77e","Type":"ContainerStarted","Data":"7f297c2daab27f445d2058b2c2c0d72782ce67c486320380f981c68e29136030"} Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.459483 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-czlhd" podStartSLOduration=2.807120513 podStartE2EDuration="42.459464156s" podCreationTimestamp="2025-06-13 05:05:19 +0000 UTC" firstStartedPulling="2025-06-13 05:05:20.247329107 +0000 UTC m=+878.693576571" lastFinishedPulling="2025-06-13 05:05:59.899672751 +0000 UTC m=+918.345920214" observedRunningTime="2025-06-13 05:06:01.453132688 +0000 UTC m=+919.899380161" watchObservedRunningTime="2025-06-13 05:06:01.459464156 +0000 UTC m=+919.905711619" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.475401 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-xz868"] Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.477042 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.478947 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.490447 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.740638712 podStartE2EDuration="41.490433509s" podCreationTimestamp="2025-06-13 05:05:20 +0000 UTC" firstStartedPulling="2025-06-13 05:05:21.171563785 +0000 UTC m=+879.617811248" lastFinishedPulling="2025-06-13 05:05:59.921358582 +0000 UTC m=+918.367606045" observedRunningTime="2025-06-13 05:06:01.484222054 +0000 UTC m=+919.930469517" watchObservedRunningTime="2025-06-13 05:06:01.490433509 +0000 UTC m=+919.936680972" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.612911 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmrr\" (UniqueName: \"kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.612982 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.713970 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmrr\" (UniqueName: \"kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.714042 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.714183 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.754197 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmrr\" (UniqueName: \"kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr\") pod \"crc-debug-xz868\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " pod="openstack/crc-debug-xz868" Jun 13 05:06:01 crc kubenswrapper[4894]: I0613 05:06:01.842984 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xz868" Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460177 4894 generic.go:334] "Generic (PLEG): container finished" podID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerID="1e2aee83dd7d837efd4f3c5fe8255106941e4d815a71c99a66bf00e1740b3f6e" exitCode=0 Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460398 4894 generic.go:334] "Generic (PLEG): container finished" podID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerID="80f9eb4af478c6303cd74a58162fac7be113efa818725736ee0ee603013f0c77" exitCode=2 Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460418 4894 generic.go:334] "Generic (PLEG): container finished" podID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerID="57af36af8daa20b8e5ce925628216b15eeaf4095f1f2f01f2a4ff2aaa60b61ed" exitCode=0 Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460224 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerDied","Data":"1e2aee83dd7d837efd4f3c5fe8255106941e4d815a71c99a66bf00e1740b3f6e"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460480 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerDied","Data":"80f9eb4af478c6303cd74a58162fac7be113efa818725736ee0ee603013f0c77"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.460494 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerDied","Data":"57af36af8daa20b8e5ce925628216b15eeaf4095f1f2f01f2a4ff2aaa60b61ed"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.462464 4894 generic.go:334] "Generic (PLEG): container finished" podID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerID="54161dd96655ee1110a683146b5ecc9d59f898e6401fc088d3822827ce58b197" exitCode=0 Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.462533 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" event={"ID":"4ca8d069-f049-48a0-baa9-12828cb5a77e","Type":"ContainerDied","Data":"54161dd96655ee1110a683146b5ecc9d59f898e6401fc088d3822827ce58b197"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.465170 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerStarted","Data":"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.465194 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerStarted","Data":"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.465299 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.467203 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xz868" event={"ID":"6bfbbd42-1ddb-4298-82ce-6a40eed330e6","Type":"ContainerStarted","Data":"696f501fc60c5500010487a14b6bc383bad97386109a2590af76f8e14abe1b9a"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.467227 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xz868" 
event={"ID":"6bfbbd42-1ddb-4298-82ce-6a40eed330e6","Type":"ContainerStarted","Data":"f830dff36fa02ea5a0c7f88308313d2d27a46513c5a2a6bde84b09c1b5b9b980"} Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.516846 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-xz868" podStartSLOduration=1.516827623 podStartE2EDuration="1.516827623s" podCreationTimestamp="2025-06-13 05:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:02.494813983 +0000 UTC m=+920.941061446" watchObservedRunningTime="2025-06-13 05:06:02.516827623 +0000 UTC m=+920.963075076" Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.519520 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84b4c54c78-v656z" podStartSLOduration=2.519515549 podStartE2EDuration="2.519515549s" podCreationTimestamp="2025-06-13 05:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:02.516039381 +0000 UTC m=+920.962286844" watchObservedRunningTime="2025-06-13 05:06:02.519515549 +0000 UTC m=+920.965763012" Jun 13 05:06:02 crc kubenswrapper[4894]: I0613 05:06:02.995266 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bb5c5d86b-bl6pt"] Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.008434 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.015364 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.016757 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.036250 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bb5c5d86b-bl6pt"] Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.047921 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f363e21-6e30-43d0-a699-9f671b627544-logs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.047967 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgmz\" (UniqueName: \"kubernetes.io/projected/9f363e21-6e30-43d0-a699-9f671b627544-kube-api-access-ndgmz\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.047985 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-public-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.048025 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data-custom\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.048051 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-combined-ca-bundle\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.048074 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.048105 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-internal-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149468 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-internal-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149567 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f363e21-6e30-43d0-a699-9f671b627544-logs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149602 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgmz\" (UniqueName: \"kubernetes.io/projected/9f363e21-6e30-43d0-a699-9f671b627544-kube-api-access-ndgmz\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149625 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-public-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149756 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data-custom\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149803 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-combined-ca-bundle\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.149836 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.151031 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f363e21-6e30-43d0-a699-9f671b627544-logs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.154916 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.156263 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-public-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.156534 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-internal-tls-certs\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.156922 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-config-data-custom\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.161369 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f363e21-6e30-43d0-a699-9f671b627544-combined-ca-bundle\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.178265 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgmz\" (UniqueName: \"kubernetes.io/projected/9f363e21-6e30-43d0-a699-9f671b627544-kube-api-access-ndgmz\") pod \"barbican-api-5bb5c5d86b-bl6pt\" (UID: \"9f363e21-6e30-43d0-a699-9f671b627544\") " pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.441714 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:03 crc kubenswrapper[4894]: I0613 05:06:03.476026 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.367852 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-xz868"] Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.368712 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-xz868" podUID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" containerName="container-00" containerID="cri-o://696f501fc60c5500010487a14b6bc383bad97386109a2590af76f8e14abe1b9a" gracePeriod=2 Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.377676 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-xz868"] Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.388528 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bb5c5d86b-bl6pt"] Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.488216 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6688589669-t4pqd" event={"ID":"baefe546-6c9e-41e3-a02c-2a5123bea0aa","Type":"ContainerStarted","Data":"048c72e1bfe9b26585de554991fdbb9e287d6163b10403af411de1fe65122e6e"} Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.490645 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" event={"ID":"4ca8d069-f049-48a0-baa9-12828cb5a77e","Type":"ContainerStarted","Data":"a74216222ee96808ce987f7f33144afb0cc514dafcaad779c3e0f9bcaa5ac3c8"} Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.490787 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.494707 4894 generic.go:334] "Generic (PLEG): container finished" podID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" containerID="696f501fc60c5500010487a14b6bc383bad97386109a2590af76f8e14abe1b9a" exitCode=0 Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.494783 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f830dff36fa02ea5a0c7f88308313d2d27a46513c5a2a6bde84b09c1b5b9b980" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.500721 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" event={"ID":"9f363e21-6e30-43d0-a699-9f671b627544","Type":"ContainerStarted","Data":"89f5e36a233bbf16a6a83e7cbc58417c377cf5580613843be70f1a5bfc58e4bf"} Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.502321 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" event={"ID":"26738de9-74f2-430a-a82d-ff0d7ec8e28f","Type":"ContainerStarted","Data":"a31110f7d16202f41777d66e3a482dab0afefb8b4e741a4abd5a80cf50d088e1"} Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.508596 4894 generic.go:334] "Generic (PLEG): container finished" podID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerID="9c433512ce23030dba6b3b11e29dd584db2a13ad00911ae1ecdffef1fb8c539c" exitCode=0 Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.508842 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerDied","Data":"9c433512ce23030dba6b3b11e29dd584db2a13ad00911ae1ecdffef1fb8c539c"} Jun 13 
05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.514632 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" podStartSLOduration=4.514613382 podStartE2EDuration="4.514613382s" podCreationTimestamp="2025-06-13 05:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:04.51243255 +0000 UTC m=+922.958680013" watchObservedRunningTime="2025-06-13 05:06:04.514613382 +0000 UTC m=+922.960860845" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.550809 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xz868" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.618968 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.636465 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmrr\" (UniqueName: \"kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr\") pod \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.636607 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host\") pod \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\" (UID: \"6bfbbd42-1ddb-4298-82ce-6a40eed330e6\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.641000 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host" (OuterVolumeSpecName: "host") pod "6bfbbd42-1ddb-4298-82ce-6a40eed330e6" (UID: "6bfbbd42-1ddb-4298-82ce-6a40eed330e6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.649756 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr" (OuterVolumeSpecName: "kube-api-access-glmrr") pod "6bfbbd42-1ddb-4298-82ce-6a40eed330e6" (UID: "6bfbbd42-1ddb-4298-82ce-6a40eed330e6"). InnerVolumeSpecName "kube-api-access-glmrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.738897 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fpsv\" (UniqueName: \"kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.738975 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739024 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739154 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739182 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739225 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739252 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle\") pod \"6b49d780-448b-4d39-be8c-4d711a72c12f\" (UID: \"6b49d780-448b-4d39-be8c-4d711a72c12f\") " Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739440 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739784 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.739964 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-run-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.740215 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmrr\" (UniqueName: \"kubernetes.io/projected/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-kube-api-access-glmrr\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.740290 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bfbbd42-1ddb-4298-82ce-6a40eed330e6-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.742523 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv" (OuterVolumeSpecName: "kube-api-access-2fpsv") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "kube-api-access-2fpsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.743600 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts" (OuterVolumeSpecName: "scripts") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.764797 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.824980 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.842034 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fpsv\" (UniqueName: \"kubernetes.io/projected/6b49d780-448b-4d39-be8c-4d711a72c12f-kube-api-access-2fpsv\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.842060 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.842072 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.842083 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b49d780-448b-4d39-be8c-4d711a72c12f-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.842091 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.843396 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data" (OuterVolumeSpecName: "config-data") pod "6b49d780-448b-4d39-be8c-4d711a72c12f" (UID: "6b49d780-448b-4d39-be8c-4d711a72c12f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:04 crc kubenswrapper[4894]: I0613 05:06:04.944077 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b49d780-448b-4d39-be8c-4d711a72c12f-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.518296 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" event={"ID":"9f363e21-6e30-43d0-a699-9f671b627544","Type":"ContainerStarted","Data":"e2dfc676896fe8fe46f5cf813c9a0ff7c8658313a4d074fe2b091a92771a5e94"} Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.518635 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" event={"ID":"9f363e21-6e30-43d0-a699-9f671b627544","Type":"ContainerStarted","Data":"2565d747da7b561910ba5a96491a7a7bcd72d7a40f02dacec37234a8feca7290"} Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.518662 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.521667 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" event={"ID":"26738de9-74f2-430a-a82d-ff0d7ec8e28f","Type":"ContainerStarted","Data":"11d6ca04c8f322b7864008d398255f29e551129e7716d3df618db9e74b19a4af"} Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.526452 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b49d780-448b-4d39-be8c-4d711a72c12f","Type":"ContainerDied","Data":"9c65826b6ad04e652bd9aeb884e7a0dd769ad4c2aa8f6783e29f0ff0c0be6cc7"} Jun 13 05:06:05 crc kubenswrapper[4894]: 
I0613 05:06:05.526493 4894 scope.go:117] "RemoveContainer" containerID="1e2aee83dd7d837efd4f3c5fe8255106941e4d815a71c99a66bf00e1740b3f6e" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.526563 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.537436 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6688589669-t4pqd" event={"ID":"baefe546-6c9e-41e3-a02c-2a5123bea0aa","Type":"ContainerStarted","Data":"c94ea60fb3eedf776144f5353576f72a1e230c45827490eee857de19c41241b3"} Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.537499 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xz868" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.563199 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" podStartSLOduration=3.563182421 podStartE2EDuration="3.563182421s" podCreationTimestamp="2025-06-13 05:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:05.560960978 +0000 UTC m=+924.007208441" watchObservedRunningTime="2025-06-13 05:06:05.563182421 +0000 UTC m=+924.009429884" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.566287 4894 scope.go:117] "RemoveContainer" containerID="80f9eb4af478c6303cd74a58162fac7be113efa818725736ee0ee603013f0c77" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.586698 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-688dbd77d4-tjxfc" podStartSLOduration=2.818844588 podStartE2EDuration="5.586683352s" podCreationTimestamp="2025-06-13 05:06:00 +0000 UTC" firstStartedPulling="2025-06-13 05:06:01.049027548 +0000 UTC m=+919.495275011" lastFinishedPulling="2025-06-13 05:06:03.816866312 +0000 UTC m=+922.263113775" observedRunningTime="2025-06-13 05:06:05.578817871 +0000 UTC m=+924.025065334" watchObservedRunningTime="2025-06-13 05:06:05.586683352 +0000 UTC m=+924.032930815" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.592687 4894 scope.go:117] "RemoveContainer" containerID="9c433512ce23030dba6b3b11e29dd584db2a13ad00911ae1ecdffef1fb8c539c" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.609129 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6688589669-t4pqd" podStartSLOduration=2.685940656 podStartE2EDuration="5.609114114s" podCreationTimestamp="2025-06-13 05:06:00 +0000 UTC" firstStartedPulling="2025-06-13 05:06:00.919967804 +0000 UTC m=+919.366215267" lastFinishedPulling="2025-06-13 05:06:03.843141262 +0000 UTC m=+922.289388725" observedRunningTime="2025-06-13 05:06:05.605911744 +0000 UTC m=+924.052159207" watchObservedRunningTime="2025-06-13 05:06:05.609114114 +0000 UTC m=+924.055361577" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.622801 4894 scope.go:117] "RemoveContainer" containerID="57af36af8daa20b8e5ce925628216b15eeaf4095f1f2f01f2a4ff2aaa60b61ed" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.636682 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.643453 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.673907 4894 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:05 crc kubenswrapper[4894]: E0613 05:06:05.674520 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-notification-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.674535 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-notification-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: E0613 05:06:05.674549 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="proxy-httpd" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.674555 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="proxy-httpd" Jun 13 05:06:05 crc kubenswrapper[4894]: E0613 05:06:05.674585 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="sg-core" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.674592 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="sg-core" Jun 13 05:06:05 crc kubenswrapper[4894]: E0613 05:06:05.674601 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-central-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.674610 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-central-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: E0613 05:06:05.674630 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" containerName="container-00" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.674635 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" containerName="container-00" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.675006 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-central-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.675049 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" containerName="container-00" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.675060 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="ceilometer-notification-agent" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.675083 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="proxy-httpd" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.675110 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" containerName="sg-core" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.680112 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.685729 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.685996 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.703722 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.755945 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756040 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756058 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756096 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756116 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756149 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.756187 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7r5\" (UniqueName: \"kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.858071 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.858639 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.859379 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.859474 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.858603 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.859735 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.859836 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7r5\" (UniqueName: \"kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.859922 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.860668 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.863508 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.868197 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.869480 4894 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.870019 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.881408 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7r5\" (UniqueName: \"kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5\") pod \"ceilometer-0\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " pod="openstack/ceilometer-0" Jun 13 05:06:05 crc kubenswrapper[4894]: I0613 05:06:05.998940 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:06 crc kubenswrapper[4894]: I0613 05:06:06.286483 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b49d780-448b-4d39-be8c-4d711a72c12f" path="/var/lib/kubelet/pods/6b49d780-448b-4d39-be8c-4d711a72c12f/volumes" Jun 13 05:06:06 crc kubenswrapper[4894]: I0613 05:06:06.287467 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfbbd42-1ddb-4298-82ce-6a40eed330e6" path="/var/lib/kubelet/pods/6bfbbd42-1ddb-4298-82ce-6a40eed330e6/volumes" Jun 13 05:06:06 crc kubenswrapper[4894]: I0613 05:06:06.482350 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:06 crc kubenswrapper[4894]: I0613 05:06:06.547254 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerStarted","Data":"2b1792ee45994154f61e29cd3b8c165f05e64bab0756584e8a92dd9571b4a14d"} Jun 13 05:06:06 crc kubenswrapper[4894]: I0613 05:06:06.547704 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:07 crc kubenswrapper[4894]: I0613 05:06:07.555697 4894 generic.go:334] "Generic (PLEG): container finished" podID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" containerID="e9a6e357fe34bfe6ad66fd02bbb1d82712e7ecc6779201673c409dac2488e220" exitCode=0 Jun 13 05:06:07 crc kubenswrapper[4894]: I0613 05:06:07.557029 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-czlhd" event={"ID":"e8402629-c5ed-4482-9a1c-bdf5caaa2a21","Type":"ContainerDied","Data":"e9a6e357fe34bfe6ad66fd02bbb1d82712e7ecc6779201673c409dac2488e220"} Jun 13 05:06:07 crc kubenswrapper[4894]: I0613 05:06:07.560237 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerStarted","Data":"3ef86c4e7dc54c76893d68b6748c30cab746ad13daacef2073150d457fc6564d"} Jun 13 05:06:08 crc kubenswrapper[4894]: I0613 05:06:08.575921 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerStarted","Data":"a70ee4afda78212584d12263582f60ff42bce6bcea0b9c96f43e6ae6c7a060f0"} Jun 13 05:06:08 crc kubenswrapper[4894]: I0613 05:06:08.944367 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-czlhd" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.014921 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.014996 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015038 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015060 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015099 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015127 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015183 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkbkw\" (UniqueName: \"kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw\") pod \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\" (UID: \"e8402629-c5ed-4482-9a1c-bdf5caaa2a21\") " Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.015751 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.016325 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "etc-localtime". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.020772 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw" (OuterVolumeSpecName: "kube-api-access-wkbkw") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "kube-api-access-wkbkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.021431 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts" (OuterVolumeSpecName: "scripts") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.022917 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.038900 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.067426 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data" (OuterVolumeSpecName: "config-data") pod "e8402629-c5ed-4482-9a1c-bdf5caaa2a21" (UID: "e8402629-c5ed-4482-9a1c-bdf5caaa2a21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117623 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117738 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117796 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117860 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117911 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.117960 4894 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.118008 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkbkw\" (UniqueName: \"kubernetes.io/projected/e8402629-c5ed-4482-9a1c-bdf5caaa2a21-kube-api-access-wkbkw\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.592380 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerStarted","Data":"a5d11207e4e9029d30acf4d6e3b0f8472469a9570958b96997c028f8fbf03c0e"} Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.605283 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-czlhd" event={"ID":"e8402629-c5ed-4482-9a1c-bdf5caaa2a21","Type":"ContainerDied","Data":"e29e9087928322668705115ae8633e2ef60e12262342c2cf0793a08eef6417c5"} Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.605343 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e29e9087928322668705115ae8633e2ef60e12262342c2cf0793a08eef6417c5" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.605441 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-czlhd" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.916794 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:09 crc kubenswrapper[4894]: E0613 05:06:09.917093 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" containerName="cinder-db-sync" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.917109 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" containerName="cinder-db-sync" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.917272 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" containerName="cinder-db-sync" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.918071 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.927502 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.927563 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bwdvs" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.927689 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.929006 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jun 13 05:06:09 crc kubenswrapper[4894]: I0613 05:06:09.953817 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.000021 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.000856 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="dnsmasq-dns" containerID="cri-o://a74216222ee96808ce987f7f33144afb0cc514dafcaad779c3e0f9bcaa5ac3c8" gracePeriod=10 Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.006555 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.031078 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.031119 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.031140 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcfj\" (UniqueName: 
\"kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.031271 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.031427 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.041669 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.041772 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.069723 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.071105 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.081714 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.145847 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.145894 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.145928 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnq29\" (UniqueName: \"kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.145956 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146000 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146032 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146087 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146134 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146154 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146172 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146188 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcfj\" (UniqueName: \"kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.146551 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.151812 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.175419 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.176189 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcfj\" (UniqueName: \"kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.181404 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.181476 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc 
kubenswrapper[4894]: I0613 05:06:10.182152 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.212491 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.213883 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.217021 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.236428 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.240318 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.247560 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.247622 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.247665 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.247704 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.247733 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnq29\" (UniqueName: \"kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.248696 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.248711 4894 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.248756 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.253321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.254020 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.268697 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnq29\" (UniqueName: \"kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29\") pod \"dnsmasq-dns-75cc475fb9-tpsp4\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.353866 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.353936 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkpj\" (UniqueName: \"kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.353952 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.353967 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.354002 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.354078 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.354129 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.354184 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.393990 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457697 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457745 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkpj\" (UniqueName: \"kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457768 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457785 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457820 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457873 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457908 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: 
\"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.457947 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.464229 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.466994 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.467215 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.467253 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.467281 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.474443 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.477809 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.488178 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkpj\" (UniqueName: \"kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj\") pod \"cinder-api-0\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.596060 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.627201 4894 generic.go:334] "Generic (PLEG): container finished" podID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerID="a74216222ee96808ce987f7f33144afb0cc514dafcaad779c3e0f9bcaa5ac3c8" exitCode=0 Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.627256 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" event={"ID":"4ca8d069-f049-48a0-baa9-12828cb5a77e","Type":"ContainerDied","Data":"a74216222ee96808ce987f7f33144afb0cc514dafcaad779c3e0f9bcaa5ac3c8"} Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.628745 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerStarted","Data":"716bbe58e254ce28d6d21522f8535b9b8f8d96c3257fa7d4d2a42e6e15e81ef8"} Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.629684 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.662564 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.51485974 podStartE2EDuration="5.655981847s" podCreationTimestamp="2025-06-13 05:06:05 +0000 UTC" firstStartedPulling="2025-06-13 05:06:06.496564695 +0000 UTC m=+924.942812158" lastFinishedPulling="2025-06-13 05:06:09.637686762 +0000 UTC m=+928.083934265" observedRunningTime="2025-06-13 05:06:10.654338981 +0000 UTC m=+929.100586444" watchObservedRunningTime="2025-06-13 05:06:10.655981847 +0000 UTC m=+929.102229310" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.701178 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Jun 13 05:06:10 crc kubenswrapper[4894]: I0613 05:06:10.881696 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.088030 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.216087 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.393027 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.401607 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc\") pod \"4ca8d069-f049-48a0-baa9-12828cb5a77e\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.406475 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb\") pod \"4ca8d069-f049-48a0-baa9-12828cb5a77e\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.411318 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb\") pod \"4ca8d069-f049-48a0-baa9-12828cb5a77e\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.411361 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mx6f\" (UniqueName: \"kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f\") pod \"4ca8d069-f049-48a0-baa9-12828cb5a77e\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.411425 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config\") pod \"4ca8d069-f049-48a0-baa9-12828cb5a77e\" (UID: \"4ca8d069-f049-48a0-baa9-12828cb5a77e\") " Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.468876 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f" (OuterVolumeSpecName: "kube-api-access-7mx6f") pod "4ca8d069-f049-48a0-baa9-12828cb5a77e" (UID: "4ca8d069-f049-48a0-baa9-12828cb5a77e"). InnerVolumeSpecName "kube-api-access-7mx6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.529995 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mx6f\" (UniqueName: \"kubernetes.io/projected/4ca8d069-f049-48a0-baa9-12828cb5a77e-kube-api-access-7mx6f\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.568322 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ca8d069-f049-48a0-baa9-12828cb5a77e" (UID: "4ca8d069-f049-48a0-baa9-12828cb5a77e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.595202 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ca8d069-f049-48a0-baa9-12828cb5a77e" (UID: "4ca8d069-f049-48a0-baa9-12828cb5a77e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.595934 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config" (OuterVolumeSpecName: "config") pod "4ca8d069-f049-48a0-baa9-12828cb5a77e" (UID: "4ca8d069-f049-48a0-baa9-12828cb5a77e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.604392 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ca8d069-f049-48a0-baa9-12828cb5a77e" (UID: "4ca8d069-f049-48a0-baa9-12828cb5a77e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.635623 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.635683 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.635696 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.635706 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca8d069-f049-48a0-baa9-12828cb5a77e-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.681760 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerStarted","Data":"a03145815541265e5cfc3ad3d7b2fde802c4283ea0ea52d57689cb5de1881fc7"} Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.688402 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" event={"ID":"4ca8d069-f049-48a0-baa9-12828cb5a77e","Type":"ContainerDied","Data":"7f297c2daab27f445d2058b2c2c0d72782ce67c486320380f981c68e29136030"} Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.688449 4894 scope.go:117] "RemoveContainer" containerID="a74216222ee96808ce987f7f33144afb0cc514dafcaad779c3e0f9bcaa5ac3c8" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.688629 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbb4b98fc-7zr69" Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.696061 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" event={"ID":"9f9b589d-4cbe-4858-986e-f7f9ec698137","Type":"ContainerStarted","Data":"e328848ab9a325aa86f462cf2e4fc0db69228931089734f98de02fb386dc665a"} Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.720869 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-central-agent" containerID="cri-o://3ef86c4e7dc54c76893d68b6748c30cab746ad13daacef2073150d457fc6564d" gracePeriod=30 Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.721867 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerStarted","Data":"41aa8524897e7da5c6d3c82c3a96f300c8bd7364875311c727e5127a5323becd"} Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.721945 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="proxy-httpd" containerID="cri-o://716bbe58e254ce28d6d21522f8535b9b8f8d96c3257fa7d4d2a42e6e15e81ef8" gracePeriod=30 Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.722347 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-notification-agent" containerID="cri-o://a70ee4afda78212584d12263582f60ff42bce6bcea0b9c96f43e6ae6c7a060f0" gracePeriod=30 Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.722419 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="sg-core" containerID="cri-o://a5d11207e4e9029d30acf4d6e3b0f8472469a9570958b96997c028f8fbf03c0e" gracePeriod=30 Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.787139 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:11 crc kubenswrapper[4894]: I0613 05:06:11.801886 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dbb4b98fc-7zr69"] Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.031330 4894 scope.go:117] "RemoveContainer" containerID="54161dd96655ee1110a683146b5ecc9d59f898e6401fc088d3822827ce58b197" Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.297453 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" path="/var/lib/kubelet/pods/4ca8d069-f049-48a0-baa9-12828cb5a77e/volumes" Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.605288 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.812889 4894 generic.go:334] "Generic (PLEG): container finished" podID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerID="9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85" exitCode=0 Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.813490 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" 
event={"ID":"9f9b589d-4cbe-4858-986e-f7f9ec698137","Type":"ContainerDied","Data":"9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890413 4894 generic.go:334] "Generic (PLEG): container finished" podID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerID="716bbe58e254ce28d6d21522f8535b9b8f8d96c3257fa7d4d2a42e6e15e81ef8" exitCode=0 Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890451 4894 generic.go:334] "Generic (PLEG): container finished" podID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerID="a5d11207e4e9029d30acf4d6e3b0f8472469a9570958b96997c028f8fbf03c0e" exitCode=2 Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890458 4894 generic.go:334] "Generic (PLEG): container finished" podID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerID="a70ee4afda78212584d12263582f60ff42bce6bcea0b9c96f43e6ae6c7a060f0" exitCode=0 Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890466 4894 generic.go:334] "Generic (PLEG): container finished" podID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerID="3ef86c4e7dc54c76893d68b6748c30cab746ad13daacef2073150d457fc6564d" exitCode=0 Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890569 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerDied","Data":"716bbe58e254ce28d6d21522f8535b9b8f8d96c3257fa7d4d2a42e6e15e81ef8"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890604 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerDied","Data":"a5d11207e4e9029d30acf4d6e3b0f8472469a9570958b96997c028f8fbf03c0e"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890615 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerDied","Data":"a70ee4afda78212584d12263582f60ff42bce6bcea0b9c96f43e6ae6c7a060f0"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.890624 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerDied","Data":"3ef86c4e7dc54c76893d68b6748c30cab746ad13daacef2073150d457fc6564d"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.906266 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerStarted","Data":"8cf832ccdc83dc8f468ced5b0ec2ad3a64f4e279efc56d39d8ddb184fa151697"} Jun 13 05:06:12 crc kubenswrapper[4894]: I0613 05:06:12.984866 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.066957 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067004 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067053 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067103 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067176 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067260 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b7r5\" (UniqueName: \"kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067283 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd\") pod \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\" (UID: \"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178\") " Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.067999 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.068020 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.086143 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.100586 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5" (OuterVolumeSpecName: "kube-api-access-8b7r5") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "kube-api-access-8b7r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.107512 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts" (OuterVolumeSpecName: "scripts") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.168559 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b7r5\" (UniqueName: \"kubernetes.io/projected/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-kube-api-access-8b7r5\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.168583 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.168592 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-run-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.168601 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.208574 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.278304 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.325277 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.379852 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.410682 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data" (OuterVolumeSpecName: "config-data") pod "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" (UID: "2a5cf0b8-18b2-4d5c-a208-96c0b26d3178"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.481631 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.639039 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.920189 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerStarted","Data":"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50"} Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.927159 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" event={"ID":"9f9b589d-4cbe-4858-986e-f7f9ec698137","Type":"ContainerStarted","Data":"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e"} Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.927515 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.935803 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a5cf0b8-18b2-4d5c-a208-96c0b26d3178","Type":"ContainerDied","Data":"2b1792ee45994154f61e29cd3b8c165f05e64bab0756584e8a92dd9571b4a14d"} Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.935884 4894 scope.go:117] "RemoveContainer" containerID="716bbe58e254ce28d6d21522f8535b9b8f8d96c3257fa7d4d2a42e6e15e81ef8" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.935822 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.960061 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerStarted","Data":"f4e31b8d991b91ac265f1d86f3f292a8840f1bcc159aab333d76522cc161bfda"} Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.961124 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api-log" containerID="cri-o://8cf832ccdc83dc8f468ced5b0ec2ad3a64f4e279efc56d39d8ddb184fa151697" gracePeriod=30 Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.961269 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.961313 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api" containerID="cri-o://f4e31b8d991b91ac265f1d86f3f292a8840f1bcc159aab333d76522cc161bfda" gracePeriod=30 Jun 13 05:06:13 crc kubenswrapper[4894]: I0613 05:06:13.980202 4894 scope.go:117] "RemoveContainer" containerID="a5d11207e4e9029d30acf4d6e3b0f8472469a9570958b96997c028f8fbf03c0e" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.057714 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" podStartSLOduration=4.057696292 podStartE2EDuration="4.057696292s" podCreationTimestamp="2025-06-13 05:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:13.960474104 +0000 UTC m=+932.406721567" watchObservedRunningTime="2025-06-13 05:06:14.057696292 +0000 UTC m=+932.503943755" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.074823 4894 scope.go:117] "RemoveContainer" containerID="a70ee4afda78212584d12263582f60ff42bce6bcea0b9c96f43e6ae6c7a060f0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.082725 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.111900 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.121644 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122049 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="sg-core" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122068 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="sg-core" Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122082 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="init" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122096 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="init" Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122107 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-central-agent" Jun 13 05:06:14 
crc kubenswrapper[4894]: I0613 05:06:14.122114 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-central-agent" Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122139 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="proxy-httpd" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122145 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="proxy-httpd" Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122166 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-notification-agent" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122172 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-notification-agent" Jun 13 05:06:14 crc kubenswrapper[4894]: E0613 05:06:14.122185 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="dnsmasq-dns" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122191 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="dnsmasq-dns" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122346 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="sg-core" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122364 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="proxy-httpd" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122372 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-central-agent" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122381 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" containerName="ceilometer-notification-agent" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.122390 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca8d069-f049-48a0-baa9-12828cb5a77e" containerName="dnsmasq-dns" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.123998 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.126743 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.126726176 podStartE2EDuration="4.126726176s" podCreationTimestamp="2025-06-13 05:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:14.010680648 +0000 UTC m=+932.456928111" watchObservedRunningTime="2025-06-13 05:06:14.126726176 +0000 UTC m=+932.572973639" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.130302 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.130528 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.134317 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.171728 4894 scope.go:117] "RemoveContainer" containerID="3ef86c4e7dc54c76893d68b6748c30cab746ad13daacef2073150d457fc6564d" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229218 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784m6\" (UniqueName: \"kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229259 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229297 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229337 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229370 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229418 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.229441 
4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.289023 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5cf0b8-18b2-4d5c-a208-96c0b26d3178" path="/var/lib/kubelet/pods/2a5cf0b8-18b2-4d5c-a208-96c0b26d3178/volumes" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.330644 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.330898 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331019 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784m6\" (UniqueName: \"kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331097 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331231 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331049 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331460 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331540 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.331619 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.337307 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.337355 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.342045 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.342140 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.357146 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784m6\" (UniqueName: \"kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6\") pod \"ceilometer-0\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.479619 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.584872 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.978592 4894 generic.go:334] "Generic (PLEG): container finished" podID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerID="8cf832ccdc83dc8f468ced5b0ec2ad3a64f4e279efc56d39d8ddb184fa151697" exitCode=143 Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.978914 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerDied","Data":"8cf832ccdc83dc8f468ced5b0ec2ad3a64f4e279efc56d39d8ddb184fa151697"} Jun 13 05:06:14 crc kubenswrapper[4894]: I0613 05:06:14.982782 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerStarted","Data":"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894"} Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.011636 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.871863918 podStartE2EDuration="6.011499532s" podCreationTimestamp="2025-06-13 05:06:09 +0000 UTC" firstStartedPulling="2025-06-13 05:06:10.900324508 +0000 UTC m=+929.346571971" lastFinishedPulling="2025-06-13 05:06:12.039960122 +0000 UTC m=+930.486207585" observedRunningTime="2025-06-13 05:06:15.004380502 +0000 UTC m=+933.450627955" watchObservedRunningTime="2025-06-13 05:06:15.011499532 +0000 UTC m=+933.457746995" Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.031211 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.195168 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-554d559d55-rtnwg" Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.241603 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.275736 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.276076 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-685f75758b-gbvvw" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-api" containerID="cri-o://4de1d4603802db591a776d4762fd23358d4b55e965d40595af3aa363a9e3cc11" gracePeriod=30 Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.276693 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-685f75758b-gbvvw" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-httpd" containerID="cri-o://970793b09107d578520bb735bb37f72b83e1a346984ff9caaaf079a70eab3137" gracePeriod=30 Jun 13 05:06:15 crc kubenswrapper[4894]: I0613 05:06:15.977092 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.015279 4894 generic.go:334] "Generic (PLEG): container finished" podID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerID="f4e31b8d991b91ac265f1d86f3f292a8840f1bcc159aab333d76522cc161bfda" exitCode=0 Jun 13 05:06:16 crc 
kubenswrapper[4894]: I0613 05:06:16.015348 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerDied","Data":"f4e31b8d991b91ac265f1d86f3f292a8840f1bcc159aab333d76522cc161bfda"} Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.015374 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60a06b7e-b832-4bf3-a368-0d4b3092c05b","Type":"ContainerDied","Data":"41aa8524897e7da5c6d3c82c3a96f300c8bd7364875311c727e5127a5323becd"} Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.015384 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41aa8524897e7da5c6d3c82c3a96f300c8bd7364875311c727e5127a5323becd" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.019902 4894 generic.go:334] "Generic (PLEG): container finished" podID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerID="970793b09107d578520bb735bb37f72b83e1a346984ff9caaaf079a70eab3137" exitCode=0 Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.019943 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerDied","Data":"970793b09107d578520bb735bb37f72b83e1a346984ff9caaaf079a70eab3137"} Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.021615 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerStarted","Data":"e4f55b17140f51f71720836ec750eb9a0b10dae8b71fcb387e54d93c480cf674"} Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.021631 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerStarted","Data":"7522228f7c867ba930983101326deb55cf29e8dcd91e0e8ebfb50221dbad1e55"} Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.045848 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.053217 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bb5c5d86b-bl6pt" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.135330 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.135525 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b4c54c78-v656z" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" containerID="cri-o://83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4" gracePeriod=30 Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.146280 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b4c54c78-v656z" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api" containerID="cri-o://db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c" gracePeriod=30 Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.153427 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-84b4c54c78-v656z" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": EOF" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.173519 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.173702 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkpj\" (UniqueName: \"kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.173867 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174042 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174178 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174306 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: 
\"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174404 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174829 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "etc-localtime". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.174957 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data\") pod \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\" (UID: \"60a06b7e-b832-4bf3-a368-0d4b3092c05b\") " Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.175441 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.181178 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.182027 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs" (OuterVolumeSpecName: "logs") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.204061 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj" (OuterVolumeSpecName: "kube-api-access-jxkpj") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "kube-api-access-jxkpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.207773 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts" (OuterVolumeSpecName: "scripts") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.207866 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.261514 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.282966 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a06b7e-b832-4bf3-a368-0d4b3092c05b-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.282995 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a06b7e-b832-4bf3-a368-0d4b3092c05b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.283006 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.283015 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkpj\" (UniqueName: \"kubernetes.io/projected/60a06b7e-b832-4bf3-a368-0d4b3092c05b-kube-api-access-jxkpj\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.283023 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.283031 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.308549 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data" (OuterVolumeSpecName: "config-data") pod "60a06b7e-b832-4bf3-a368-0d4b3092c05b" (UID: "60a06b7e-b832-4bf3-a368-0d4b3092c05b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:16 crc kubenswrapper[4894]: I0613 05:06:16.384778 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a06b7e-b832-4bf3-a368-0d4b3092c05b-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.048584 4894 generic.go:334] "Generic (PLEG): container finished" podID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerID="83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4" exitCode=143 Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.048903 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerDied","Data":"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4"} Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.067860 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerStarted","Data":"a6158c773d1065c3d665963aed74d5b95d30f4bd6a37163714ffd4c22598d33f"} Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.071332 4894 generic.go:334] "Generic (PLEG): container finished" podID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerID="4de1d4603802db591a776d4762fd23358d4b55e965d40595af3aa363a9e3cc11" exitCode=0 Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.071442 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.072169 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerDied","Data":"4de1d4603802db591a776d4762fd23358d4b55e965d40595af3aa363a9e3cc11"} Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.107815 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.113596 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.125049 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:17 crc kubenswrapper[4894]: E0613 05:06:17.125386 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api-log" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.125402 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api-log" Jun 13 05:06:17 crc kubenswrapper[4894]: E0613 05:06:17.125428 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.125434 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.125573 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api-log" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.125588 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" containerName="cinder-api" Jun 13 
05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.126411 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.131917 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.132074 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.133424 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.134260 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.304353 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7nt\" (UniqueName: \"kubernetes.io/projected/2bea4009-3767-4a17-8687-18008e27effd-kube-api-access-kl7nt\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.304599 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bea4009-3767-4a17-8687-18008e27effd-logs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.304643 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.304685 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-scripts\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309705 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309765 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309848 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-localtime\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309873 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309935 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.309994 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411427 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411477 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411526 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7nt\" (UniqueName: \"kubernetes.io/projected/2bea4009-3767-4a17-8687-18008e27effd-kube-api-access-kl7nt\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411573 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bea4009-3767-4a17-8687-18008e27effd-logs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411608 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411627 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-scripts\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411646 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: 
I0613 05:06:17.411725 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411750 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-localtime\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.411767 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.412289 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.418791 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bea4009-3767-4a17-8687-18008e27effd-logs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.419144 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/2bea4009-3767-4a17-8687-18008e27effd-etc-localtime\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.419670 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.427685 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.428023 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.433616 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.434058 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-config-data\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.434261 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bea4009-3767-4a17-8687-18008e27effd-scripts\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.443257 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7nt\" (UniqueName: \"kubernetes.io/projected/2bea4009-3767-4a17-8687-18008e27effd-kube-api-access-kl7nt\") pod \"cinder-api-0\" (UID: \"2bea4009-3767-4a17-8687-18008e27effd\") " pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.448171 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.521194 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.625388 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhvhc\" (UniqueName: \"kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc\") pod \"60b704f8-037e-4d44-a89d-d8a7ae539b13\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.625490 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config\") pod \"60b704f8-037e-4d44-a89d-d8a7ae539b13\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.625519 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle\") pod \"60b704f8-037e-4d44-a89d-d8a7ae539b13\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.625613 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs\") pod \"60b704f8-037e-4d44-a89d-d8a7ae539b13\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.625631 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config\") pod \"60b704f8-037e-4d44-a89d-d8a7ae539b13\" (UID: \"60b704f8-037e-4d44-a89d-d8a7ae539b13\") " Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.643547 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc" (OuterVolumeSpecName: "kube-api-access-zhvhc") pod "60b704f8-037e-4d44-a89d-d8a7ae539b13" (UID: "60b704f8-037e-4d44-a89d-d8a7ae539b13"). InnerVolumeSpecName "kube-api-access-zhvhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.653605 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "60b704f8-037e-4d44-a89d-d8a7ae539b13" (UID: "60b704f8-037e-4d44-a89d-d8a7ae539b13"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.727611 4894 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-httpd-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.728055 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhvhc\" (UniqueName: \"kubernetes.io/projected/60b704f8-037e-4d44-a89d-d8a7ae539b13-kube-api-access-zhvhc\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.735781 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "60b704f8-037e-4d44-a89d-d8a7ae539b13" (UID: "60b704f8-037e-4d44-a89d-d8a7ae539b13"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.746865 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config" (OuterVolumeSpecName: "config") pod "60b704f8-037e-4d44-a89d-d8a7ae539b13" (UID: "60b704f8-037e-4d44-a89d-d8a7ae539b13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.770785 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b704f8-037e-4d44-a89d-d8a7ae539b13" (UID: "60b704f8-037e-4d44-a89d-d8a7ae539b13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.829184 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.829222 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:17 crc kubenswrapper[4894]: I0613 05:06:17.829233 4894 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b704f8-037e-4d44-a89d-d8a7ae539b13-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.004312 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.091734 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bea4009-3767-4a17-8687-18008e27effd","Type":"ContainerStarted","Data":"ef2bd20878bee1a3df5d4ee58e5adbb5930a7edb18eb72a49379ecb7f69d9201"} Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.103408 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685f75758b-gbvvw" event={"ID":"60b704f8-037e-4d44-a89d-d8a7ae539b13","Type":"ContainerDied","Data":"560839a5b0de45eb04d6a09df10142d0d510a0c49a69d6c4fca6f7ce68c3e61e"} Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.103455 4894 scope.go:117] "RemoveContainer" containerID="970793b09107d578520bb735bb37f72b83e1a346984ff9caaaf079a70eab3137" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.103574 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-685f75758b-gbvvw" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.116409 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerStarted","Data":"ce0f8bbb8df896b5567cb3b6b22ab6a17b0e59b89dd268ca89bf58789e8ed85c"} Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.137730 4894 scope.go:117] "RemoveContainer" containerID="4de1d4603802db591a776d4762fd23358d4b55e965d40595af3aa363a9e3cc11" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.145752 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.150464 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-685f75758b-gbvvw"] Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.287073 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a06b7e-b832-4bf3-a368-0d4b3092c05b" path="/var/lib/kubelet/pods/60a06b7e-b832-4bf3-a368-0d4b3092c05b/volumes" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.288520 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" path="/var/lib/kubelet/pods/60b704f8-037e-4d44-a89d-d8a7ae539b13/volumes" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.819230 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:06:18 crc kubenswrapper[4894]: I0613 05:06:18.931149 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56495d6d8b-4hpz7" Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.136462 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerStarted","Data":"a845cdb91e73eb0a217314b189863df1d78dfd7ac49b517932f6ca0e32194787"} Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.136596 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.151954 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bea4009-3767-4a17-8687-18008e27effd","Type":"ContainerStarted","Data":"098b5d9588c0f4cc20a740a66878ebb97066dc2e710d36f5d4312a81595c52f1"} Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.174997 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.673984139 podStartE2EDuration="6.174974079s" podCreationTimestamp="2025-06-13 05:06:13 +0000 UTC" firstStartedPulling="2025-06-13 05:06:15.04336455 +0000 UTC m=+933.489612013" lastFinishedPulling="2025-06-13 05:06:18.54435449 +0000 UTC m=+936.990601953" observedRunningTime="2025-06-13 05:06:19.163692202 +0000 UTC m=+937.609939665" watchObservedRunningTime="2025-06-13 05:06:19.174974079 +0000 UTC m=+937.621221542" Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.247150 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:19 crc kubenswrapper[4894]: I0613 05:06:19.939354 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6bfbcbb6c7-q4p46" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.161203 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"2bea4009-3767-4a17-8687-18008e27effd","Type":"ContainerStarted","Data":"0a19faa20bf2f6f57e06af97298dfce81fc8d7dc8485e9bb9623eba7031c04f6"} Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.161357 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.395773 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.416883 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.416866582 podStartE2EDuration="3.416866582s" podCreationTimestamp="2025-06-13 05:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:20.206757215 +0000 UTC m=+938.653004678" watchObservedRunningTime="2025-06-13 05:06:20.416866582 +0000 UTC m=+938.863114045" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.444989 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.445200 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b59768489-skrdq" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="dnsmasq-dns" containerID="cri-o://d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5" gracePeriod=10 Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.588887 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.628337 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.737368 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84b4c54c78-v656z" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:53354->10.217.0.147:9311: read: connection reset by peer" Jun 13 05:06:20 crc kubenswrapper[4894]: I0613 05:06:20.737466 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84b4c54c78-v656z" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:53370->10.217.0.147:9311: read: connection reset by peer" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.003867 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.106014 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc\") pod \"2968fe1b-db73-45f4-974f-cab909b022f3\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.106144 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config\") pod \"2968fe1b-db73-45f4-974f-cab909b022f3\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.106202 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb\") pod \"2968fe1b-db73-45f4-974f-cab909b022f3\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.106302 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb\") pod \"2968fe1b-db73-45f4-974f-cab909b022f3\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.106324 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g8jb\" (UniqueName: \"kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb\") pod \"2968fe1b-db73-45f4-974f-cab909b022f3\" (UID: \"2968fe1b-db73-45f4-974f-cab909b022f3\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.126573 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb" (OuterVolumeSpecName: "kube-api-access-2g8jb") pod "2968fe1b-db73-45f4-974f-cab909b022f3" (UID: "2968fe1b-db73-45f4-974f-cab909b022f3"). InnerVolumeSpecName "kube-api-access-2g8jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.178626 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2968fe1b-db73-45f4-974f-cab909b022f3" (UID: "2968fe1b-db73-45f4-974f-cab909b022f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.209320 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.209351 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g8jb\" (UniqueName: \"kubernetes.io/projected/2968fe1b-db73-45f4-974f-cab909b022f3-kube-api-access-2g8jb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.213088 4894 generic.go:334] "Generic (PLEG): container finished" podID="2968fe1b-db73-45f4-974f-cab909b022f3" containerID="d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5" exitCode=0 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.213147 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b59768489-skrdq" event={"ID":"2968fe1b-db73-45f4-974f-cab909b022f3","Type":"ContainerDied","Data":"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5"} Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.213175 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b59768489-skrdq" event={"ID":"2968fe1b-db73-45f4-974f-cab909b022f3","Type":"ContainerDied","Data":"a20ce62aac6e55da936f1b1e775ae543b84d40556ec57a1f1aaef1be17f54aeb"} Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.213202 4894 scope.go:117] "RemoveContainer" containerID="d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.213307 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b59768489-skrdq" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.215768 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.219273 4894 generic.go:334] "Generic (PLEG): container finished" podID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerID="db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c" exitCode=0 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220255 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerDied","Data":"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c"} Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220286 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b4c54c78-v656z" event={"ID":"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a","Type":"ContainerDied","Data":"989d0f7cbe4caadc36cd6b78c9b59d5052af19b80d54a17ddad6aec565ea2334"} Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220436 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-central-agent" containerID="cri-o://e4f55b17140f51f71720836ec750eb9a0b10dae8b71fcb387e54d93c480cf674" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220614 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="cinder-scheduler" containerID="cri-o://2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220717 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="proxy-httpd" containerID="cri-o://a845cdb91e73eb0a217314b189863df1d78dfd7ac49b517932f6ca0e32194787" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220831 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-notification-agent" containerID="cri-o://a6158c773d1065c3d665963aed74d5b95d30f4bd6a37163714ffd4c22598d33f" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.221038 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="probe" containerID="cri-o://72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.223639 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2968fe1b-db73-45f4-974f-cab909b022f3" (UID: "2968fe1b-db73-45f4-974f-cab909b022f3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.220768 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="sg-core" containerID="cri-o://ce0f8bbb8df896b5567cb3b6b22ab6a17b0e59b89dd268ca89bf58789e8ed85c" gracePeriod=30 Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.271523 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config" (OuterVolumeSpecName: "config") pod "2968fe1b-db73-45f4-974f-cab909b022f3" (UID: "2968fe1b-db73-45f4-974f-cab909b022f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.280567 4894 scope.go:117] "RemoveContainer" containerID="6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.281994 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2968fe1b-db73-45f4-974f-cab909b022f3" (UID: "2968fe1b-db73-45f4-974f-cab909b022f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.308987 4894 scope.go:117] "RemoveContainer" containerID="d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.311297 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rmjq\" (UniqueName: \"kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq\") pod \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.311365 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data\") pod \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.311461 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom\") pod \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.311511 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs\") pod \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.311557 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle\") pod \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\" (UID: \"9e24fd1b-12d2-475d-94b8-1cae51b9fd7a\") " Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.312085 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.312096 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.312105 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968fe1b-db73-45f4-974f-cab909b022f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: E0613 05:06:21.312234 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5\": container with ID starting with d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5 not found: ID does not exist" containerID="d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.312265 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5"} err="failed to get container status \"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5\": rpc error: code = NotFound desc = could not find container \"d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5\": container with ID starting with d821b6c955767199653e573989d8d242bfb15643795070dfeca12fd3ac3d6bd5 not found: ID does not exist" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.312284 4894 scope.go:117] "RemoveContainer" containerID="6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae" Jun 13 05:06:21 crc kubenswrapper[4894]: E0613 05:06:21.313316 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae\": container with ID starting with 6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae not found: ID does not exist" containerID="6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.313410 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae"} err="failed to get container status \"6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae\": rpc error: code = NotFound desc = could not find container \"6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae\": container with ID starting with 6d50fd13c650ea23d64fa4ace1eb23bb590b9a151e8bb188fd42949ce6e45dae not found: ID does not exist" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.313485 4894 scope.go:117] "RemoveContainer" containerID="db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.316273 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs" (OuterVolumeSpecName: "logs") pod "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" (UID: "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.317935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" (UID: "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.318361 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq" (OuterVolumeSpecName: "kube-api-access-7rmjq") pod "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" (UID: "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a"). InnerVolumeSpecName "kube-api-access-7rmjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.350040 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" (UID: "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.356875 4894 scope.go:117] "RemoveContainer" containerID="83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.377586 4894 scope.go:117] "RemoveContainer" containerID="db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c" Jun 13 05:06:21 crc kubenswrapper[4894]: E0613 05:06:21.378051 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c\": container with ID starting with db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c not found: ID does not exist" containerID="db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.378083 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c"} err="failed to get container status \"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c\": rpc error: code = NotFound desc = could not find container \"db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c\": container with ID starting with db294e6dd11502a5e2959179fe1b3d57690cfa5bbf0f322faf249c50521aca9c not found: ID does not exist" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.378105 4894 scope.go:117] "RemoveContainer" containerID="83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4" Jun 13 05:06:21 crc kubenswrapper[4894]: E0613 05:06:21.378365 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4\": container with ID starting with 83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4 not found: ID does not exist" containerID="83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.378409 4894 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4"} err="failed to get container status \"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4\": rpc error: code = NotFound desc = could not find container \"83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4\": container with ID starting with 83751ce1f028f63ac80a01b694e0fc3823ab76c582060bae986b5038326e20e4 not found: ID does not exist" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.381211 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data" (OuterVolumeSpecName: "config-data") pod "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" (UID: "9e24fd1b-12d2-475d-94b8-1cae51b9fd7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.414471 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rmjq\" (UniqueName: \"kubernetes.io/projected/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-kube-api-access-7rmjq\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.416713 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.416735 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.416752 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.416773 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.565375 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:06:21 crc kubenswrapper[4894]: I0613 05:06:21.578137 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b59768489-skrdq"] Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.234495 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b4c54c78-v656z" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257124 4894 generic.go:334] "Generic (PLEG): container finished" podID="a5207fd3-b02c-4294-a990-75e9356334b5" containerID="a845cdb91e73eb0a217314b189863df1d78dfd7ac49b517932f6ca0e32194787" exitCode=0 Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257150 4894 generic.go:334] "Generic (PLEG): container finished" podID="a5207fd3-b02c-4294-a990-75e9356334b5" containerID="ce0f8bbb8df896b5567cb3b6b22ab6a17b0e59b89dd268ca89bf58789e8ed85c" exitCode=2 Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257158 4894 generic.go:334] "Generic (PLEG): container finished" podID="a5207fd3-b02c-4294-a990-75e9356334b5" containerID="a6158c773d1065c3d665963aed74d5b95d30f4bd6a37163714ffd4c22598d33f" exitCode=0 Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257191 4894 generic.go:334] "Generic (PLEG): container finished" podID="a5207fd3-b02c-4294-a990-75e9356334b5" containerID="e4f55b17140f51f71720836ec750eb9a0b10dae8b71fcb387e54d93c480cf674" exitCode=0 Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257230 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerDied","Data":"a845cdb91e73eb0a217314b189863df1d78dfd7ac49b517932f6ca0e32194787"} Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257276 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerDied","Data":"ce0f8bbb8df896b5567cb3b6b22ab6a17b0e59b89dd268ca89bf58789e8ed85c"} Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257287 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerDied","Data":"a6158c773d1065c3d665963aed74d5b95d30f4bd6a37163714ffd4c22598d33f"} Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.257302 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerDied","Data":"e4f55b17140f51f71720836ec750eb9a0b10dae8b71fcb387e54d93c480cf674"} Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.260554 4894 generic.go:334] "Generic (PLEG): container finished" podID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerID="72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894" exitCode=0 Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.260578 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerDied","Data":"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894"} Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.307537 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" path="/var/lib/kubelet/pods/2968fe1b-db73-45f4-974f-cab909b022f3/volumes" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.308198 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.308226 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84b4c54c78-v656z"] Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.349240 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431315 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431668 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431692 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431784 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431834 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431875 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784m6\" (UniqueName: \"kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.431899 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data\") pod \"a5207fd3-b02c-4294-a990-75e9356334b5\" (UID: \"a5207fd3-b02c-4294-a990-75e9356334b5\") " Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.432454 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.432891 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.440472 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6" (OuterVolumeSpecName: "kube-api-access-784m6") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "kube-api-access-784m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.443249 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts" (OuterVolumeSpecName: "scripts") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455321 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455727 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455744 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455759 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="sg-core" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455765 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="sg-core" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455773 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-api" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455796 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-api" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455837 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-notification-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455843 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-notification-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455855 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455861 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455871 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-central-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455877 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-central-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455886 4894 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="proxy-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455893 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="proxy-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455900 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="dnsmasq-dns" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455907 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="dnsmasq-dns" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455924 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="init" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455930 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="init" Jun 13 05:06:22 crc kubenswrapper[4894]: E0613 05:06:22.455940 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.455947 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456099 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456112 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456119 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="sg-core" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456126 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-notification-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456133 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="proxy-httpd" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456146 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" containerName="ceilometer-central-agent" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456155 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" containerName="barbican-api-log" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456168 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2968fe1b-db73-45f4-974f-cab909b022f3" containerName="dnsmasq-dns" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.456177 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b704f8-037e-4d44-a89d-d8a7ae539b13" containerName="neutron-api" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.457383 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.459268 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xm68z" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.459701 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.459863 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.475724 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.498836 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.533978 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534030 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534074 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534093 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhls\" (UniqueName: \"kubernetes.io/projected/6fee8542-bed9-434a-86c2-709235db9cf0-kube-api-access-dwhls\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534135 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784m6\" (UniqueName: \"kubernetes.io/projected/a5207fd3-b02c-4294-a990-75e9356334b5-kube-api-access-784m6\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534145 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534157 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-run-httpd\") on node \"crc\" DevicePath \"\"" 
Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534169 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.534180 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5207fd3-b02c-4294-a990-75e9356334b5-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.540419 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.546760 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data" (OuterVolumeSpecName: "config-data") pod "a5207fd3-b02c-4294-a990-75e9356334b5" (UID: "a5207fd3-b02c-4294-a990-75e9356334b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635216 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635468 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635564 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635676 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhls\" (UniqueName: \"kubernetes.io/projected/6fee8542-bed9-434a-86c2-709235db9cf0-kube-api-access-dwhls\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635781 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.635841 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5207fd3-b02c-4294-a990-75e9356334b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.636375 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.640268 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.640688 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fee8542-bed9-434a-86c2-709235db9cf0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.654479 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhls\" (UniqueName: \"kubernetes.io/projected/6fee8542-bed9-434a-86c2-709235db9cf0-kube-api-access-dwhls\") pod \"openstackclient\" (UID: \"6fee8542-bed9-434a-86c2-709235db9cf0\") " pod="openstack/openstackclient" Jun 13 05:06:22 crc kubenswrapper[4894]: I0613 05:06:22.790970 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.267590 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.272498 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5207fd3-b02c-4294-a990-75e9356334b5","Type":"ContainerDied","Data":"7522228f7c867ba930983101326deb55cf29e8dcd91e0e8ebfb50221dbad1e55"} Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.272542 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.272548 4894 scope.go:117] "RemoveContainer" containerID="a845cdb91e73eb0a217314b189863df1d78dfd7ac49b517932f6ca0e32194787" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.278692 4894 generic.go:334] "Generic (PLEG): container finished" podID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerID="2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50" exitCode=0 Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.278738 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.278754 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerDied","Data":"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50"} Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.279384 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc22b655-eac4-4af2-9751-77b7953c45bd","Type":"ContainerDied","Data":"a03145815541265e5cfc3ad3d7b2fde802c4283ea0ea52d57689cb5de1881fc7"} Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.299715 4894 scope.go:117] "RemoveContainer" containerID="ce0f8bbb8df896b5567cb3b6b22ab6a17b0e59b89dd268ca89bf58789e8ed85c" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.305565 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.342748 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.348022 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.348647 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.348835 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.348910 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.349009 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.349212 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.349272 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.349334 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rxcfj\" (UniqueName: \"kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj\") pod \"fc22b655-eac4-4af2-9751-77b7953c45bd\" (UID: \"fc22b655-eac4-4af2-9751-77b7953c45bd\") " Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.350721 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.350824 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "etc-localtime". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.359294 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts" (OuterVolumeSpecName: "scripts") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.365226 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.368730 4894 scope.go:117] "RemoveContainer" containerID="a6158c773d1065c3d665963aed74d5b95d30f4bd6a37163714ffd4c22598d33f" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.369076 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj" (OuterVolumeSpecName: "kube-api-access-rxcfj") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "kube-api-access-rxcfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.395729 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: E0613 05:06:23.396086 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="probe" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.396098 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="probe" Jun 13 05:06:23 crc kubenswrapper[4894]: E0613 05:06:23.396130 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="cinder-scheduler" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.396136 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="cinder-scheduler" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.396321 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="probe" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.396342 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" containerName="cinder-scheduler" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.402648 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.404806 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.418123 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.418316 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.439304 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.443296 4894 scope.go:117] "RemoveContainer" containerID="e4f55b17140f51f71720836ec750eb9a0b10dae8b71fcb387e54d93c480cf674" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451388 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451444 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451607 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451637 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451691 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451714 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2fd\" (UniqueName: \"kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451746 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451806 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451818 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451826 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451836 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451845 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc22b655-eac4-4af2-9751-77b7953c45bd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.451853 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcfj\" (UniqueName: \"kubernetes.io/projected/fc22b655-eac4-4af2-9751-77b7953c45bd-kube-api-access-rxcfj\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.467140 4894 scope.go:117] "RemoveContainer" containerID="72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.488248 4894 scope.go:117] "RemoveContainer" containerID="2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.509088 4894 scope.go:117] "RemoveContainer" containerID="72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894" Jun 13 05:06:23 crc kubenswrapper[4894]: E0613 05:06:23.509507 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894\": container with ID starting with 72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894 not found: ID does not exist" containerID="72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.509540 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894"} err="failed to get container status \"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894\": rpc error: code = NotFound desc = could not find container \"72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894\": container with ID starting with 72a1961086ce9ad30f58ee9b986c0b531f2c42a71fa50e3840cc4c02f9d14894 not found: ID does not exist" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.509562 4894 scope.go:117] "RemoveContainer" containerID="2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50" Jun 13 05:06:23 crc kubenswrapper[4894]: E0613 05:06:23.510260 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50\": container with ID starting with 2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50 not found: ID does not exist" containerID="2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.510284 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50"} err="failed to get container status \"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50\": rpc error: code = NotFound desc = could not find 
container \"2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50\": container with ID starting with 2c95ee0995583f9b15e620ad1f788da1086dfc78213b2859c8fa44d93c059e50 not found: ID does not exist" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.512311 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data" (OuterVolumeSpecName: "config-data") pod "fc22b655-eac4-4af2-9751-77b7953c45bd" (UID: "fc22b655-eac4-4af2-9751-77b7953c45bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552706 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552750 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552766 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552799 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552820 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2fd\" (UniqueName: \"kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552852 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552910 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.552973 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc22b655-eac4-4af2-9751-77b7953c45bd-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.553270 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.555794 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.556105 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.556880 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.560225 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.571216 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2fd\" (UniqueName: \"kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.572873 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts\") pod \"ceilometer-0\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.664267 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.669494 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.705807 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.708731 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.710054 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.719442 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.740041 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755515 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755564 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755596 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755709 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755764 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755780 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.755845 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdw8w\" (UniqueName: \"kubernetes.io/projected/4f9c3009-19d2-4508-bf6d-11881d3f028a-kube-api-access-hdw8w\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859149 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859387 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 
05:06:23.859403 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859433 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdw8w\" (UniqueName: \"kubernetes.io/projected/4f9c3009-19d2-4508-bf6d-11881d3f028a-kube-api-access-hdw8w\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859495 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-localtime\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859524 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859552 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.859577 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.860012 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9c3009-19d2-4508-bf6d-11881d3f028a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.871762 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.873227 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.874175 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.889355 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdw8w\" (UniqueName: \"kubernetes.io/projected/4f9c3009-19d2-4508-bf6d-11881d3f028a-kube-api-access-hdw8w\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:23 crc kubenswrapper[4894]: I0613 05:06:23.891725 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9c3009-19d2-4508-bf6d-11881d3f028a-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9c3009-19d2-4508-bf6d-11881d3f028a\") " pod="openstack/cinder-scheduler-0" Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.030033 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.184243 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:24 crc kubenswrapper[4894]: W0613 05:06:24.202092 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f078cff_aba9_412d_80c8_f96dbad6dfc5.slice/crio-eeaa54b3918c35f545b723f776df3bc4cf729fc237c2f586ad618bc28bc96e88 WatchSource:0}: Error finding container eeaa54b3918c35f545b723f776df3bc4cf729fc237c2f586ad618bc28bc96e88: Status 404 returned error can't find the container with id eeaa54b3918c35f545b723f776df3bc4cf729fc237c2f586ad618bc28bc96e88 Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.290920 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e24fd1b-12d2-475d-94b8-1cae51b9fd7a" path="/var/lib/kubelet/pods/9e24fd1b-12d2-475d-94b8-1cae51b9fd7a/volumes" Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.291669 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5207fd3-b02c-4294-a990-75e9356334b5" path="/var/lib/kubelet/pods/a5207fd3-b02c-4294-a990-75e9356334b5/volumes" Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.292304 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc22b655-eac4-4af2-9751-77b7953c45bd" path="/var/lib/kubelet/pods/fc22b655-eac4-4af2-9751-77b7953c45bd/volumes" Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.293558 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerStarted","Data":"eeaa54b3918c35f545b723f776df3bc4cf729fc237c2f586ad618bc28bc96e88"} Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.293586 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6fee8542-bed9-434a-86c2-709235db9cf0","Type":"ContainerStarted","Data":"cb118588105575710408c7a008b552e24c5bb472bc83669e75d6cf5357b8dc1f"} Jun 13 05:06:24 crc kubenswrapper[4894]: I0613 05:06:24.475897 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jun 13 05:06:24 crc kubenswrapper[4894]: W0613 05:06:24.484759 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9c3009_19d2_4508_bf6d_11881d3f028a.slice/crio-94b18f3d1db87a0aaa8ab98a58ba05147fad085e836007258ba42cfa80cb0ad4 WatchSource:0}: Error finding container 
94b18f3d1db87a0aaa8ab98a58ba05147fad085e836007258ba42cfa80cb0ad4: Status 404 returned error can't find the container with id 94b18f3d1db87a0aaa8ab98a58ba05147fad085e836007258ba42cfa80cb0ad4 Jun 13 05:06:25 crc kubenswrapper[4894]: I0613 05:06:25.302772 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9c3009-19d2-4508-bf6d-11881d3f028a","Type":"ContainerStarted","Data":"ec55425c3cd06e326e99436e483220efd719e2f95b6ffbde8e498d4949d520d0"} Jun 13 05:06:25 crc kubenswrapper[4894]: I0613 05:06:25.302811 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9c3009-19d2-4508-bf6d-11881d3f028a","Type":"ContainerStarted","Data":"94b18f3d1db87a0aaa8ab98a58ba05147fad085e836007258ba42cfa80cb0ad4"} Jun 13 05:06:25 crc kubenswrapper[4894]: I0613 05:06:25.303709 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerStarted","Data":"6909d41c82285cca40fcb233294c389a3c9511817a026e94d9689d19146274ee"} Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.236616 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.236995 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.237036 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.237670 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.237726 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47" gracePeriod=600 Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.313609 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerStarted","Data":"40229692ae43b8e7288bdacf78b97ef6bb36f199dfac819373ed42ef2a213332"} Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.320231 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9c3009-19d2-4508-bf6d-11881d3f028a","Type":"ContainerStarted","Data":"2b67b86e3a1ab61e61aff70d839c6f53344c83be325c5c8f278054cc6ab12a45"} Jun 13 05:06:26 crc kubenswrapper[4894]: I0613 05:06:26.340323 4894 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.340309471 podStartE2EDuration="3.340309471s" podCreationTimestamp="2025-06-13 05:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:06:26.338469209 +0000 UTC m=+944.784716672" watchObservedRunningTime="2025-06-13 05:06:26.340309471 +0000 UTC m=+944.786556934" Jun 13 05:06:27 crc kubenswrapper[4894]: I0613 05:06:27.331890 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47" exitCode=0 Jun 13 05:06:27 crc kubenswrapper[4894]: I0613 05:06:27.332165 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47"} Jun 13 05:06:27 crc kubenswrapper[4894]: I0613 05:06:27.332189 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6"} Jun 13 05:06:27 crc kubenswrapper[4894]: I0613 05:06:27.332203 4894 scope.go:117] "RemoveContainer" containerID="ecd53f1961aac6210ea5766812553b7eca34bc56e6e6ac062fd75e7b6d67fcbe" Jun 13 05:06:27 crc kubenswrapper[4894]: I0613 05:06:27.336440 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerStarted","Data":"c98c39735cae191de171c6b03c671820900afc2eee2656684ed9f3b3ff450e2a"} Jun 13 05:06:28 crc kubenswrapper[4894]: I0613 05:06:28.352106 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerStarted","Data":"07de5de95618d8bacdc4e69d1f907a307b7ec843c73c2e33c9aeaba50a4dba59"} Jun 13 05:06:28 crc kubenswrapper[4894]: I0613 05:06:28.353772 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:06:28 crc kubenswrapper[4894]: I0613 05:06:28.375316 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.161817983 podStartE2EDuration="5.375303047s" podCreationTimestamp="2025-06-13 05:06:23 +0000 UTC" firstStartedPulling="2025-06-13 05:06:24.206133361 +0000 UTC m=+942.652380824" lastFinishedPulling="2025-06-13 05:06:27.419618425 +0000 UTC m=+945.865865888" observedRunningTime="2025-06-13 05:06:28.367503898 +0000 UTC m=+946.813751361" watchObservedRunningTime="2025-06-13 05:06:28.375303047 +0000 UTC m=+946.821550510" Jun 13 05:06:29 crc kubenswrapper[4894]: I0613 05:06:29.031231 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jun 13 05:06:29 crc kubenswrapper[4894]: I0613 05:06:29.518044 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.774993 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-t9lbd"] Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.776482 4894 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.784887 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t9lbd"] Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.815459 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2nmb\" (UniqueName: \"kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb\") pod \"nova-api-db-create-t9lbd\" (UID: \"3eb429d3-d365-4dd2-9a8c-680247079215\") " pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.865994 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-sgg28"] Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.871143 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.879994 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sgg28"] Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.916037 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7xd\" (UniqueName: \"kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd\") pod \"nova-cell0-db-create-sgg28\" (UID: \"e952982d-eac0-4ff8-8817-551eed327bed\") " pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.916097 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2nmb\" (UniqueName: \"kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb\") pod \"nova-api-db-create-t9lbd\" (UID: \"3eb429d3-d365-4dd2-9a8c-680247079215\") " pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.933388 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2nmb\" (UniqueName: \"kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb\") pod \"nova-api-db-create-t9lbd\" (UID: \"3eb429d3-d365-4dd2-9a8c-680247079215\") " pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.973402 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hsnzt"] Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.974301 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:31 crc kubenswrapper[4894]: I0613 05:06:31.987804 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hsnzt"] Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.018022 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7xd\" (UniqueName: \"kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd\") pod \"nova-cell0-db-create-sgg28\" (UID: \"e952982d-eac0-4ff8-8817-551eed327bed\") " pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.018259 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxg9p\" (UniqueName: \"kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p\") pod \"nova-cell1-db-create-hsnzt\" (UID: \"9d33205b-2c37-4b71-900b-a8e83762b63f\") " pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.057241 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7xd\" (UniqueName: \"kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd\") pod \"nova-cell0-db-create-sgg28\" (UID: \"e952982d-eac0-4ff8-8817-551eed327bed\") " pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.095477 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.119694 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxg9p\" (UniqueName: \"kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p\") pod \"nova-cell1-db-create-hsnzt\" (UID: \"9d33205b-2c37-4b71-900b-a8e83762b63f\") " pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.136043 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxg9p\" (UniqueName: \"kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p\") pod \"nova-cell1-db-create-hsnzt\" (UID: \"9d33205b-2c37-4b71-900b-a8e83762b63f\") " pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.184916 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:32 crc kubenswrapper[4894]: I0613 05:06:32.288512 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:34 crc kubenswrapper[4894]: I0613 05:06:34.297982 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jun 13 05:06:35 crc kubenswrapper[4894]: I0613 05:06:35.758394 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t9lbd"] Jun 13 05:06:35 crc kubenswrapper[4894]: W0613 05:06:35.767703 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb429d3_d365_4dd2_9a8c_680247079215.slice/crio-93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11 WatchSource:0}: Error finding container 93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11: Status 404 returned error can't find the container with id 93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11 Jun 13 05:06:35 crc kubenswrapper[4894]: I0613 05:06:35.833174 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sgg28"] Jun 13 05:06:35 crc kubenswrapper[4894]: W0613 05:06:35.868531 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode952982d_eac0_4ff8_8817_551eed327bed.slice/crio-6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31 WatchSource:0}: Error finding container 6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31: Status 404 returned error can't find the container with id 6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31 Jun 13 05:06:35 crc kubenswrapper[4894]: I0613 05:06:35.998993 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hsnzt"] Jun 13 05:06:36 crc kubenswrapper[4894]: W0613 05:06:36.004047 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d33205b_2c37_4b71_900b_a8e83762b63f.slice/crio-f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54 WatchSource:0}: Error finding container f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54: Status 404 returned error can't find the container with id f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54 Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.457589 4894 generic.go:334] "Generic (PLEG): container finished" podID="3eb429d3-d365-4dd2-9a8c-680247079215" containerID="c6d46d7984c702bd799dff7331a0b0c8020ddae459044cce2084d76338fe1bad" exitCode=0 Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.457677 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t9lbd" event={"ID":"3eb429d3-d365-4dd2-9a8c-680247079215","Type":"ContainerDied","Data":"c6d46d7984c702bd799dff7331a0b0c8020ddae459044cce2084d76338fe1bad"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.457705 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t9lbd" event={"ID":"3eb429d3-d365-4dd2-9a8c-680247079215","Type":"ContainerStarted","Data":"93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.460223 4894 generic.go:334] "Generic (PLEG): container finished" podID="e952982d-eac0-4ff8-8817-551eed327bed" containerID="aafb0254c609ddb94ce465f092dced91a23b4fd418ddc153f1988d06bc4ce1ee" exitCode=0 Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.460290 4894 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sgg28" event={"ID":"e952982d-eac0-4ff8-8817-551eed327bed","Type":"ContainerDied","Data":"aafb0254c609ddb94ce465f092dced91a23b4fd418ddc153f1988d06bc4ce1ee"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.460335 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sgg28" event={"ID":"e952982d-eac0-4ff8-8817-551eed327bed","Type":"ContainerStarted","Data":"6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.462119 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6fee8542-bed9-434a-86c2-709235db9cf0","Type":"ContainerStarted","Data":"67752c01cc7421ae96b85193a4807403a6a7e5865ea3df659dd6d7d22223e8f8"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.463862 4894 generic.go:334] "Generic (PLEG): container finished" podID="9d33205b-2c37-4b71-900b-a8e83762b63f" containerID="4beda1607aaafccd72fa80f4b61226123ba8d80f05e896cef806cbe77d0cc0e8" exitCode=0 Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.463902 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hsnzt" event={"ID":"9d33205b-2c37-4b71-900b-a8e83762b63f","Type":"ContainerDied","Data":"4beda1607aaafccd72fa80f4b61226123ba8d80f05e896cef806cbe77d0cc0e8"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.463919 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hsnzt" event={"ID":"9d33205b-2c37-4b71-900b-a8e83762b63f","Type":"ContainerStarted","Data":"f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54"} Jun 13 05:06:36 crc kubenswrapper[4894]: I0613 05:06:36.517000 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.496768406 podStartE2EDuration="14.516984324s" podCreationTimestamp="2025-06-13 05:06:22 +0000 UTC" firstStartedPulling="2025-06-13 05:06:23.401547013 +0000 UTC m=+941.847794466" lastFinishedPulling="2025-06-13 05:06:35.421762921 +0000 UTC m=+953.868010384" observedRunningTime="2025-06-13 05:06:36.514854434 +0000 UTC m=+954.961101897" watchObservedRunningTime="2025-06-13 05:06:36.516984324 +0000 UTC m=+954.963231777" Jun 13 05:06:37 crc kubenswrapper[4894]: I0613 05:06:37.866474 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:37 crc kubenswrapper[4894]: I0613 05:06:37.930565 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxg9p\" (UniqueName: \"kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p\") pod \"9d33205b-2c37-4b71-900b-a8e83762b63f\" (UID: \"9d33205b-2c37-4b71-900b-a8e83762b63f\") " Jun 13 05:06:37 crc kubenswrapper[4894]: I0613 05:06:37.936980 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p" (OuterVolumeSpecName: "kube-api-access-qxg9p") pod "9d33205b-2c37-4b71-900b-a8e83762b63f" (UID: "9d33205b-2c37-4b71-900b-a8e83762b63f"). InnerVolumeSpecName "kube-api-access-qxg9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:37 crc kubenswrapper[4894]: I0613 05:06:37.996169 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.001508 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.033586 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxg9p\" (UniqueName: \"kubernetes.io/projected/9d33205b-2c37-4b71-900b-a8e83762b63f-kube-api-access-qxg9p\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.134503 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2nmb\" (UniqueName: \"kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb\") pod \"3eb429d3-d365-4dd2-9a8c-680247079215\" (UID: \"3eb429d3-d365-4dd2-9a8c-680247079215\") " Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.134605 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7xd\" (UniqueName: \"kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd\") pod \"e952982d-eac0-4ff8-8817-551eed327bed\" (UID: \"e952982d-eac0-4ff8-8817-551eed327bed\") " Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.138011 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb" (OuterVolumeSpecName: "kube-api-access-z2nmb") pod "3eb429d3-d365-4dd2-9a8c-680247079215" (UID: "3eb429d3-d365-4dd2-9a8c-680247079215"). InnerVolumeSpecName "kube-api-access-z2nmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.138943 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd" (OuterVolumeSpecName: "kube-api-access-vb7xd") pod "e952982d-eac0-4ff8-8817-551eed327bed" (UID: "e952982d-eac0-4ff8-8817-551eed327bed"). InnerVolumeSpecName "kube-api-access-vb7xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.237699 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2nmb\" (UniqueName: \"kubernetes.io/projected/3eb429d3-d365-4dd2-9a8c-680247079215-kube-api-access-z2nmb\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.237736 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7xd\" (UniqueName: \"kubernetes.io/projected/e952982d-eac0-4ff8-8817-551eed327bed-kube-api-access-vb7xd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.488387 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t9lbd" event={"ID":"3eb429d3-d365-4dd2-9a8c-680247079215","Type":"ContainerDied","Data":"93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11"} Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.488697 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b15696dd7b0ef46ce901af92ce8334ac5f77c01c83c99237c97c1475e48c11" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.488440 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t9lbd" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.490804 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sgg28" event={"ID":"e952982d-eac0-4ff8-8817-551eed327bed","Type":"ContainerDied","Data":"6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31"} Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.490861 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8f1b12972d0ed91741b3569ce322ab35704e4467d3cda87e44dda74fd12b31" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.490879 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sgg28" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.492819 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hsnzt" event={"ID":"9d33205b-2c37-4b71-900b-a8e83762b63f","Type":"ContainerDied","Data":"f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54"} Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.492857 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0839db35ab8e31a89fafe4e789110fa1b67765970225cd31634731e1fbbfa54" Jun 13 05:06:38 crc kubenswrapper[4894]: I0613 05:06:38.492932 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hsnzt" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.820631 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c7ad-account-create-5vctm"] Jun 13 05:06:51 crc kubenswrapper[4894]: E0613 05:06:51.823045 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e952982d-eac0-4ff8-8817-551eed327bed" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.823155 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e952982d-eac0-4ff8-8817-551eed327bed" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: E0613 05:06:51.823253 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d33205b-2c37-4b71-900b-a8e83762b63f" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.823329 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d33205b-2c37-4b71-900b-a8e83762b63f" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: E0613 05:06:51.823436 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb429d3-d365-4dd2-9a8c-680247079215" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.823515 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb429d3-d365-4dd2-9a8c-680247079215" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.823790 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e952982d-eac0-4ff8-8817-551eed327bed" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.823915 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d33205b-2c37-4b71-900b-a8e83762b63f" containerName="mariadb-database-create" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.824105 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb429d3-d365-4dd2-9a8c-680247079215" containerName="mariadb-database-create" Jun 13 05:06:51 
crc kubenswrapper[4894]: I0613 05:06:51.824761 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.827206 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.868138 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7ad-account-create-5vctm"] Jun 13 05:06:51 crc kubenswrapper[4894]: I0613 05:06:51.969761 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhdd\" (UniqueName: \"kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd\") pod \"nova-api-c7ad-account-create-5vctm\" (UID: \"28eb4b39-d8e5-4dde-ae16-931e82a524d6\") " pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.012895 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7c5d-account-create-rnzd5"] Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.013865 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.016288 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.032829 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7c5d-account-create-rnzd5"] Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.071812 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhdd\" (UniqueName: \"kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd\") pod \"nova-api-c7ad-account-create-5vctm\" (UID: \"28eb4b39-d8e5-4dde-ae16-931e82a524d6\") " pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.097562 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhdd\" (UniqueName: \"kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd\") pod \"nova-api-c7ad-account-create-5vctm\" (UID: \"28eb4b39-d8e5-4dde-ae16-931e82a524d6\") " pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.169569 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.173909 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2cr\" (UniqueName: \"kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr\") pod \"nova-cell0-7c5d-account-create-rnzd5\" (UID: \"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa\") " pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.208650 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-edbe-account-create-jzgzz"] Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.209802 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.211543 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.228554 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-edbe-account-create-jzgzz"] Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.278817 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2cr\" (UniqueName: \"kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr\") pod \"nova-cell0-7c5d-account-create-rnzd5\" (UID: \"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa\") " pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.314761 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2cr\" (UniqueName: \"kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr\") pod \"nova-cell0-7c5d-account-create-rnzd5\" (UID: \"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa\") " pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.363321 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.379677 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zwr\" (UniqueName: \"kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr\") pod \"nova-cell1-edbe-account-create-jzgzz\" (UID: \"81bbe296-097f-4020-bd12-476dfc968482\") " pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.481784 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zwr\" (UniqueName: \"kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr\") pod \"nova-cell1-edbe-account-create-jzgzz\" (UID: \"81bbe296-097f-4020-bd12-476dfc968482\") " pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.500446 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zwr\" (UniqueName: \"kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr\") pod \"nova-cell1-edbe-account-create-jzgzz\" (UID: \"81bbe296-097f-4020-bd12-476dfc968482\") " pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.576616 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.665542 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7ad-account-create-5vctm"] Jun 13 05:06:52 crc kubenswrapper[4894]: I0613 05:06:52.811326 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7c5d-account-create-rnzd5"] Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.043506 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-edbe-account-create-jzgzz"] Jun 13 05:06:53 crc kubenswrapper[4894]: W0613 05:06:53.112996 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bbe296_097f_4020_bd12_476dfc968482.slice/crio-fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db WatchSource:0}: Error finding container fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db: Status 404 returned error can't find the container with id fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.655983 4894 generic.go:334] "Generic (PLEG): container finished" podID="28eb4b39-d8e5-4dde-ae16-931e82a524d6" containerID="f37dadb62cd3fffdd7fbe1898bf07bfb37f8de28814fe1919c14cecf93e655c3" exitCode=0 Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.656054 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-5vctm" event={"ID":"28eb4b39-d8e5-4dde-ae16-931e82a524d6","Type":"ContainerDied","Data":"f37dadb62cd3fffdd7fbe1898bf07bfb37f8de28814fe1919c14cecf93e655c3"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.656497 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-5vctm" event={"ID":"28eb4b39-d8e5-4dde-ae16-931e82a524d6","Type":"ContainerStarted","Data":"c345a80afc0fba73dc4e6fa92a080a45920e1d93c6865889b675352c4031c108"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.658812 4894 generic.go:334] "Generic (PLEG): container finished" podID="fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" containerID="b7501545ceeb8b2797eb76f0fbad6c45682cf63db86e6e58d401dc5368c14243" exitCode=0 Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.658955 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" event={"ID":"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa","Type":"ContainerDied","Data":"b7501545ceeb8b2797eb76f0fbad6c45682cf63db86e6e58d401dc5368c14243"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.658975 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" event={"ID":"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa","Type":"ContainerStarted","Data":"2dbcc8d4a553227809507c225b47b45353a87feaf6ecb4b6a9c7fbadc0d9169f"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.670608 4894 generic.go:334] "Generic (PLEG): container finished" podID="81bbe296-097f-4020-bd12-476dfc968482" containerID="ec261e4aba0a91e1b40ef5fcb71e22a63b7e01095b2ed40b51fa369bbbf4fec5" exitCode=0 Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.670672 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-edbe-account-create-jzgzz" event={"ID":"81bbe296-097f-4020-bd12-476dfc968482","Type":"ContainerDied","Data":"ec261e4aba0a91e1b40ef5fcb71e22a63b7e01095b2ed40b51fa369bbbf4fec5"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 
05:06:53.670718 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-edbe-account-create-jzgzz" event={"ID":"81bbe296-097f-4020-bd12-476dfc968482","Type":"ContainerStarted","Data":"fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db"} Jun 13 05:06:53 crc kubenswrapper[4894]: I0613 05:06:53.755199 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.027529 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.138895 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zwr\" (UniqueName: \"kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr\") pod \"81bbe296-097f-4020-bd12-476dfc968482\" (UID: \"81bbe296-097f-4020-bd12-476dfc968482\") " Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.179861 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr" (OuterVolumeSpecName: "kube-api-access-67zwr") pod "81bbe296-097f-4020-bd12-476dfc968482" (UID: "81bbe296-097f-4020-bd12-476dfc968482"). InnerVolumeSpecName "kube-api-access-67zwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.242715 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67zwr\" (UniqueName: \"kubernetes.io/projected/81bbe296-097f-4020-bd12-476dfc968482-kube-api-access-67zwr\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.246478 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.297779 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.343376 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc2cr\" (UniqueName: \"kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr\") pod \"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa\" (UID: \"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa\") " Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.365944 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr" (OuterVolumeSpecName: "kube-api-access-lc2cr") pod "fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" (UID: "fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa"). InnerVolumeSpecName "kube-api-access-lc2cr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.444721 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhdd\" (UniqueName: \"kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd\") pod \"28eb4b39-d8e5-4dde-ae16-931e82a524d6\" (UID: \"28eb4b39-d8e5-4dde-ae16-931e82a524d6\") " Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.445525 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc2cr\" (UniqueName: \"kubernetes.io/projected/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa-kube-api-access-lc2cr\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.448500 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd" (OuterVolumeSpecName: "kube-api-access-xkhdd") pod "28eb4b39-d8e5-4dde-ae16-931e82a524d6" (UID: "28eb4b39-d8e5-4dde-ae16-931e82a524d6"). InnerVolumeSpecName "kube-api-access-xkhdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.547391 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhdd\" (UniqueName: \"kubernetes.io/projected/28eb4b39-d8e5-4dde-ae16-931e82a524d6-kube-api-access-xkhdd\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.693673 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-5vctm" event={"ID":"28eb4b39-d8e5-4dde-ae16-931e82a524d6","Type":"ContainerDied","Data":"c345a80afc0fba73dc4e6fa92a080a45920e1d93c6865889b675352c4031c108"} Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.693872 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c345a80afc0fba73dc4e6fa92a080a45920e1d93c6865889b675352c4031c108" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.693888 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7ad-account-create-5vctm" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.695399 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" event={"ID":"fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa","Type":"ContainerDied","Data":"2dbcc8d4a553227809507c225b47b45353a87feaf6ecb4b6a9c7fbadc0d9169f"} Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.695423 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbcc8d4a553227809507c225b47b45353a87feaf6ecb4b6a9c7fbadc0d9169f" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.695462 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7c5d-account-create-rnzd5" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.706188 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-edbe-account-create-jzgzz" event={"ID":"81bbe296-097f-4020-bd12-476dfc968482","Type":"ContainerDied","Data":"fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db"} Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.706294 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5e1a43cc4bc2e7f8f244e0aaca740e8fbaae23b1b47c5b2210546b58c9c0db" Jun 13 05:06:55 crc kubenswrapper[4894]: I0613 05:06:55.706373 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-edbe-account-create-jzgzz" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.255420 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gb6nq"] Jun 13 05:06:57 crc kubenswrapper[4894]: E0613 05:06:57.256013 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256027 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: E0613 05:06:57.256042 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28eb4b39-d8e5-4dde-ae16-931e82a524d6" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256049 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="28eb4b39-d8e5-4dde-ae16-931e82a524d6" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: E0613 05:06:57.256057 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bbe296-097f-4020-bd12-476dfc968482" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256062 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bbe296-097f-4020-bd12-476dfc968482" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256207 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="28eb4b39-d8e5-4dde-ae16-931e82a524d6" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256218 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.256231 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bbe296-097f-4020-bd12-476dfc968482" containerName="mariadb-account-create" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.262444 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.264935 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qjs74" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.265981 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.266502 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.271128 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gb6nq"] Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.392188 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.392242 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw52m\" (UniqueName: \"kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.392327 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.392389 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.473742 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.473980 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="eaa87fe1-544c-4780-a350-acb43e14d346" containerName="kube-state-metrics" containerID="cri-o://7e03455584cb09199bc2f4a8d1aa180aebc9806ba69ee48f6c8d328382dc8639" gracePeriod=30 Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.493911 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.493980 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw52m\" (UniqueName: 
\"kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.494054 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.494127 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.506599 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.509048 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.516522 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.529452 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw52m\" (UniqueName: \"kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m\") pod \"nova-cell0-conductor-db-sync-gb6nq\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.605292 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.756158 4894 generic.go:334] "Generic (PLEG): container finished" podID="eaa87fe1-544c-4780-a350-acb43e14d346" containerID="7e03455584cb09199bc2f4a8d1aa180aebc9806ba69ee48f6c8d328382dc8639" exitCode=2 Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.756236 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eaa87fe1-544c-4780-a350-acb43e14d346","Type":"ContainerDied","Data":"7e03455584cb09199bc2f4a8d1aa180aebc9806ba69ee48f6c8d328382dc8639"} Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.903248 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:06:57 crc kubenswrapper[4894]: I0613 05:06:57.960253 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gb6nq"] Jun 13 05:06:57 crc kubenswrapper[4894]: W0613 05:06:57.962122 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d3ed6c_aff6_4d8c_a6c6_05341fb1a048.slice/crio-bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6 WatchSource:0}: Error finding container bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6: Status 404 returned error can't find the container with id bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6 Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.003048 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqvq\" (UniqueName: \"kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq\") pod \"eaa87fe1-544c-4780-a350-acb43e14d346\" (UID: \"eaa87fe1-544c-4780-a350-acb43e14d346\") " Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.011756 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq" (OuterVolumeSpecName: "kube-api-access-fsqvq") pod "eaa87fe1-544c-4780-a350-acb43e14d346" (UID: "eaa87fe1-544c-4780-a350-acb43e14d346"). InnerVolumeSpecName "kube-api-access-fsqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.104731 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqvq\" (UniqueName: \"kubernetes.io/projected/eaa87fe1-544c-4780-a350-acb43e14d346-kube-api-access-fsqvq\") on node \"crc\" DevicePath \"\"" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.778371 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eaa87fe1-544c-4780-a350-acb43e14d346","Type":"ContainerDied","Data":"068f6e763ee6b33a6e948b3a9434ad07499175da44552bf35f033683890776ec"} Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.778915 4894 scope.go:117] "RemoveContainer" containerID="7e03455584cb09199bc2f4a8d1aa180aebc9806ba69ee48f6c8d328382dc8639" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.778383 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.779897 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" event={"ID":"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048","Type":"ContainerStarted","Data":"bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6"} Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.808807 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.818124 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.836471 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:58 crc kubenswrapper[4894]: E0613 05:06:58.836903 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa87fe1-544c-4780-a350-acb43e14d346" containerName="kube-state-metrics" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.836923 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa87fe1-544c-4780-a350-acb43e14d346" containerName="kube-state-metrics" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.837072 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa87fe1-544c-4780-a350-acb43e14d346" containerName="kube-state-metrics" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.837667 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.845184 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.849672 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.849910 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.920260 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v4z5\" (UniqueName: \"kubernetes.io/projected/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-api-access-6v4z5\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.920323 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.920395 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.920431 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.960183 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.960460 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-central-agent" containerID="cri-o://6909d41c82285cca40fcb233294c389a3c9511817a026e94d9689d19146274ee" gracePeriod=30 Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.960789 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="proxy-httpd" containerID="cri-o://07de5de95618d8bacdc4e69d1f907a307b7ec843c73c2e33c9aeaba50a4dba59" gracePeriod=30 Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.960884 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="sg-core" containerID="cri-o://c98c39735cae191de171c6b03c671820900afc2eee2656684ed9f3b3ff450e2a" gracePeriod=30 Jun 13 05:06:58 crc kubenswrapper[4894]: I0613 05:06:58.960957 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-notification-agent" containerID="cri-o://40229692ae43b8e7288bdacf78b97ef6bb36f199dfac819373ed42ef2a213332" gracePeriod=30 Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.021623 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v4z5\" (UniqueName: \"kubernetes.io/projected/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-api-access-6v4z5\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.021748 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.021855 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.021897 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.030056 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.031019 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.031988 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5acc39-066b-40b1-abcf-b5311aea15d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.041835 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v4z5\" (UniqueName: \"kubernetes.io/projected/5c5acc39-066b-40b1-abcf-b5311aea15d9-kube-api-access-6v4z5\") pod \"kube-state-metrics-0\" (UID: \"5c5acc39-066b-40b1-abcf-b5311aea15d9\") " pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.181239 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.623824 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793069 4894 generic.go:334] "Generic (PLEG): container finished" podID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerID="07de5de95618d8bacdc4e69d1f907a307b7ec843c73c2e33c9aeaba50a4dba59" exitCode=0 Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793097 4894 generic.go:334] "Generic (PLEG): container finished" podID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerID="c98c39735cae191de171c6b03c671820900afc2eee2656684ed9f3b3ff450e2a" exitCode=2 Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793105 4894 generic.go:334] "Generic (PLEG): container finished" podID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerID="6909d41c82285cca40fcb233294c389a3c9511817a026e94d9689d19146274ee" exitCode=0 Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793106 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerDied","Data":"07de5de95618d8bacdc4e69d1f907a307b7ec843c73c2e33c9aeaba50a4dba59"} Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793146 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerDied","Data":"c98c39735cae191de171c6b03c671820900afc2eee2656684ed9f3b3ff450e2a"} Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.793156 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerDied","Data":"6909d41c82285cca40fcb233294c389a3c9511817a026e94d9689d19146274ee"} Jun 13 05:06:59 crc kubenswrapper[4894]: I0613 05:06:59.794106 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5c5acc39-066b-40b1-abcf-b5311aea15d9","Type":"ContainerStarted","Data":"d91f39ebdcd1e175dfe04ac56297d67312750bc6be5df2441bf5a57471952053"} Jun 13 05:07:00 crc kubenswrapper[4894]: I0613 05:07:00.286093 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa87fe1-544c-4780-a350-acb43e14d346" path="/var/lib/kubelet/pods/eaa87fe1-544c-4780-a350-acb43e14d346/volumes" Jun 13 05:07:00 crc kubenswrapper[4894]: I0613 05:07:00.806916 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c5acc39-066b-40b1-abcf-b5311aea15d9","Type":"ContainerStarted","Data":"c723cc6147ac7c4eb2c6bbe325bad3dfaf3e74ce4c7e44f75ffee2d7afd9142f"} Jun 13 05:07:00 crc kubenswrapper[4894]: I0613 05:07:00.808095 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jun 13 05:07:00 crc kubenswrapper[4894]: I0613 05:07:00.832115 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.469699242 podStartE2EDuration="2.832098837s" podCreationTimestamp="2025-06-13 05:06:58 +0000 UTC" firstStartedPulling="2025-06-13 05:06:59.637305002 +0000 UTC m=+978.083552465" lastFinishedPulling="2025-06-13 05:06:59.999704597 +0000 UTC m=+978.445952060" observedRunningTime="2025-06-13 05:07:00.826840059 +0000 UTC m=+979.273087532" watchObservedRunningTime="2025-06-13 05:07:00.832098837 +0000 UTC m=+979.278346320" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.810999 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-nffhn"] Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.811973 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nffhn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.814707 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.874963 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.875227 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbtn\" (UniqueName: \"kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.978453 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbtn\" (UniqueName: \"kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.978796 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:01 crc kubenswrapper[4894]: I0613 05:07:01.978980 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:02 crc kubenswrapper[4894]: I0613 05:07:02.011284 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbtn\" (UniqueName: \"kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn\") pod \"crc-debug-nffhn\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " pod="openstack/crc-debug-nffhn" Jun 13 05:07:02 crc kubenswrapper[4894]: I0613 05:07:02.134615 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nffhn" Jun 13 05:07:02 crc kubenswrapper[4894]: I0613 05:07:02.831947 4894 generic.go:334] "Generic (PLEG): container finished" podID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerID="40229692ae43b8e7288bdacf78b97ef6bb36f199dfac819373ed42ef2a213332" exitCode=0 Jun 13 05:07:02 crc kubenswrapper[4894]: I0613 05:07:02.831981 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerDied","Data":"40229692ae43b8e7288bdacf78b97ef6bb36f199dfac819373ed42ef2a213332"} Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.375273 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474151 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474588 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474633 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474685 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474753 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474795 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2fd\" (UniqueName: \"kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd\") pod 
\"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.474852 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data\") pod \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\" (UID: \"2f078cff-aba9-412d-80c8-f96dbad6dfc5\") " Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.476467 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.477957 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.483996 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd" (OuterVolumeSpecName: "kube-api-access-fw2fd") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "kube-api-access-fw2fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.484223 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts" (OuterVolumeSpecName: "scripts") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.524828 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.563403 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583622 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583671 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583681 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f078cff-aba9-412d-80c8-f96dbad6dfc5-run-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583689 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583698 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.583707 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2fd\" (UniqueName: \"kubernetes.io/projected/2f078cff-aba9-412d-80c8-f96dbad6dfc5-kube-api-access-fw2fd\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.694754 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data" (OuterVolumeSpecName: "config-data") pod "2f078cff-aba9-412d-80c8-f96dbad6dfc5" (UID: "2f078cff-aba9-412d-80c8-f96dbad6dfc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.787315 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f078cff-aba9-412d-80c8-f96dbad6dfc5-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.876217 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-nffhn" event={"ID":"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db","Type":"ContainerStarted","Data":"c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3"} Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.876256 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-nffhn" event={"ID":"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db","Type":"ContainerStarted","Data":"66d994f92be9124fb66f80a18b5be972086b035a61873f2b2268f223b0b38e35"} Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.879637 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f078cff-aba9-412d-80c8-f96dbad6dfc5","Type":"ContainerDied","Data":"eeaa54b3918c35f545b723f776df3bc4cf729fc237c2f586ad618bc28bc96e88"} Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.879692 4894 scope.go:117] "RemoveContainer" containerID="07de5de95618d8bacdc4e69d1f907a307b7ec843c73c2e33c9aeaba50a4dba59" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.879788 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.890289 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" event={"ID":"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048","Type":"ContainerStarted","Data":"72acc3ef48c68ca48e5567f7ae89fb63a4cb5632da4e751ac791c51d234c886f"} Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.890721 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-nffhn" podStartSLOduration=5.890708223 podStartE2EDuration="5.890708223s" podCreationTimestamp="2025-06-13 05:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:06.887676168 +0000 UTC m=+985.333923631" watchObservedRunningTime="2025-06-13 05:07:06.890708223 +0000 UTC m=+985.336955686" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.909734 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" podStartSLOduration=1.648905507 podStartE2EDuration="9.909720049s" podCreationTimestamp="2025-06-13 05:06:57 +0000 UTC" firstStartedPulling="2025-06-13 05:06:57.964158104 +0000 UTC m=+976.410405567" lastFinishedPulling="2025-06-13 05:07:06.224972646 +0000 UTC m=+984.671220109" observedRunningTime="2025-06-13 05:07:06.90442813 +0000 UTC m=+985.350675593" watchObservedRunningTime="2025-06-13 05:07:06.909720049 +0000 UTC m=+985.355967512" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.922395 4894 scope.go:117] "RemoveContainer" containerID="c98c39735cae191de171c6b03c671820900afc2eee2656684ed9f3b3ff450e2a" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.924405 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.935531 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.948785 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:07:06 crc kubenswrapper[4894]: E0613 05:07:06.949107 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="proxy-httpd" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.949123 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="proxy-httpd" Jun 13 05:07:06 crc kubenswrapper[4894]: E0613 05:07:06.949159 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-central-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.949165 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-central-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: E0613 05:07:06.949181 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-notification-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.949188 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-notification-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: E0613 05:07:06.949206 4894 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="sg-core" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.949211 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="sg-core" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.953517 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="proxy-httpd" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.959760 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="sg-core" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.959799 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-central-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.959824 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" containerName="ceilometer-notification-agent" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.961511 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.962059 4894 scope.go:117] "RemoveContainer" containerID="40229692ae43b8e7288bdacf78b97ef6bb36f199dfac819373ed42ef2a213332" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.963931 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.964087 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.966170 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jun 13 05:07:06 crc kubenswrapper[4894]: I0613 05:07:06.970876 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.024152 4894 scope.go:117] "RemoveContainer" containerID="6909d41c82285cca40fcb233294c389a3c9511817a026e94d9689d19146274ee" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092453 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092507 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092541 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092555 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092586 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092625 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dddm\" (UniqueName: \"kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092669 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.092686 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.193920 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194137 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194258 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194339 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194420 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194510 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dddm\" (UniqueName: \"kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194592 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.194681 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.195109 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.195309 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.198645 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.199353 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.200375 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.202060 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.209833 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.213928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8dddm\" (UniqueName: \"kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm\") pod \"ceilometer-0\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.318998 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.785030 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:07:07 crc kubenswrapper[4894]: W0613 05:07:07.787931 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e0b9d8_63c6_48d1_88e3_2402eb6002a2.slice/crio-3e0e645ce26a3dfac3cb8c5fba8e5bc15dfe8da946fcbe19c63caa0da8226d91 WatchSource:0}: Error finding container 3e0e645ce26a3dfac3cb8c5fba8e5bc15dfe8da946fcbe19c63caa0da8226d91: Status 404 returned error can't find the container with id 3e0e645ce26a3dfac3cb8c5fba8e5bc15dfe8da946fcbe19c63caa0da8226d91 Jun 13 05:07:07 crc kubenswrapper[4894]: I0613 05:07:07.898850 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerStarted","Data":"3e0e645ce26a3dfac3cb8c5fba8e5bc15dfe8da946fcbe19c63caa0da8226d91"} Jun 13 05:07:08 crc kubenswrapper[4894]: I0613 05:07:08.286983 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f078cff-aba9-412d-80c8-f96dbad6dfc5" path="/var/lib/kubelet/pods/2f078cff-aba9-412d-80c8-f96dbad6dfc5/volumes" Jun 13 05:07:08 crc kubenswrapper[4894]: I0613 05:07:08.909767 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerStarted","Data":"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea"} Jun 13 05:07:09 crc kubenswrapper[4894]: I0613 05:07:09.203229 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jun 13 05:07:09 crc kubenswrapper[4894]: I0613 05:07:09.920627 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerStarted","Data":"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093"} Jun 13 05:07:10 crc kubenswrapper[4894]: I0613 05:07:10.933991 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerStarted","Data":"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56"} Jun 13 05:07:11 crc kubenswrapper[4894]: I0613 05:07:11.944242 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerStarted","Data":"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a"} Jun 13 05:07:11 crc kubenswrapper[4894]: I0613 05:07:11.945536 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:07:11 crc kubenswrapper[4894]: I0613 05:07:11.968277 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141805955 podStartE2EDuration="5.968258261s" podCreationTimestamp="2025-06-13 05:07:06 +0000 UTC" firstStartedPulling="2025-06-13 05:07:07.790168472 +0000 UTC m=+986.236415935" 
lastFinishedPulling="2025-06-13 05:07:11.616620788 +0000 UTC m=+990.062868241" observedRunningTime="2025-06-13 05:07:11.961696046 +0000 UTC m=+990.407943509" watchObservedRunningTime="2025-06-13 05:07:11.968258261 +0000 UTC m=+990.414505724" Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.023366 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-nffhn"] Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.024220 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-nffhn" podUID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" containerName="container-00" containerID="cri-o://c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3" gracePeriod=2 Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.043230 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-nffhn"] Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.134593 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nffhn" Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.187337 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbtn\" (UniqueName: \"kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn\") pod \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.187438 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host\") pod \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\" (UID: \"ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db\") " Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.187617 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host" (OuterVolumeSpecName: "host") pod "ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" (UID: "ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.188007 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.196497 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn" (OuterVolumeSpecName: "kube-api-access-kxbtn") pod "ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" (UID: "ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db"). InnerVolumeSpecName "kube-api-access-kxbtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:17 crc kubenswrapper[4894]: I0613 05:07:17.298544 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbtn\" (UniqueName: \"kubernetes.io/projected/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db-kube-api-access-kxbtn\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.007633 4894 generic.go:334] "Generic (PLEG): container finished" podID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" containerID="c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3" exitCode=0 Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.008213 4894 scope.go:117] "RemoveContainer" containerID="c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3" Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.008416 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nffhn" Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.052979 4894 scope.go:117] "RemoveContainer" containerID="c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3" Jun 13 05:07:18 crc kubenswrapper[4894]: E0613 05:07:18.053626 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3\": container with ID starting with c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3 not found: ID does not exist" containerID="c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3" Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.053903 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3"} err="failed to get container status \"c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3\": rpc error: code = NotFound desc = could not find container \"c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3\": container with ID starting with c028ce7cca1efb4d3520f9167e5effa152e2909c2ec765fa5670866863f4fce3 not found: ID does not exist" Jun 13 05:07:18 crc kubenswrapper[4894]: I0613 05:07:18.299826 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" path="/var/lib/kubelet/pods/ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db/volumes" Jun 13 05:07:19 crc kubenswrapper[4894]: I0613 05:07:19.020803 4894 generic.go:334] "Generic (PLEG): container finished" podID="49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" containerID="72acc3ef48c68ca48e5567f7ae89fb63a4cb5632da4e751ac791c51d234c886f" exitCode=0 Jun 13 05:07:19 crc kubenswrapper[4894]: I0613 05:07:19.020913 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" event={"ID":"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048","Type":"ContainerDied","Data":"72acc3ef48c68ca48e5567f7ae89fb63a4cb5632da4e751ac791c51d234c886f"} Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.426843 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.594066 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data\") pod \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.594254 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw52m\" (UniqueName: \"kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m\") pod \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.594397 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle\") pod \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.594499 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts\") pod \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\" (UID: \"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048\") " Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.599607 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m" (OuterVolumeSpecName: "kube-api-access-zw52m") pod "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" (UID: "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048"). InnerVolumeSpecName "kube-api-access-zw52m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.613023 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts" (OuterVolumeSpecName: "scripts") pod "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" (UID: "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.627674 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" (UID: "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.633918 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data" (OuterVolumeSpecName: "config-data") pod "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" (UID: "49d3ed6c-aff6-4d8c-a6c6-05341fb1a048"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.696177 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw52m\" (UniqueName: \"kubernetes.io/projected/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-kube-api-access-zw52m\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.696206 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.696219 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:20 crc kubenswrapper[4894]: I0613 05:07:20.696231 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.050172 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" event={"ID":"49d3ed6c-aff6-4d8c-a6c6-05341fb1a048","Type":"ContainerDied","Data":"bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6"} Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.050247 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf065f62351b6dd4674a53b797134fae4ed6b09fa3c06148c662cbbd1f6cfaa6" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.050349 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gb6nq" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.187813 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jun 13 05:07:21 crc kubenswrapper[4894]: E0613 05:07:21.188240 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" containerName="nova-cell0-conductor-db-sync" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.188261 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" containerName="nova-cell0-conductor-db-sync" Jun 13 05:07:21 crc kubenswrapper[4894]: E0613 05:07:21.188280 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" containerName="container-00" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.188288 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" containerName="container-00" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.188493 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" containerName="nova-cell0-conductor-db-sync" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.188525 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd914f3-e511-4fc4-bb0a-1e96ad3cd1db" containerName="container-00" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.189197 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.192062 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qjs74" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.192263 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.201905 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.307979 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.308192 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg68t\" (UniqueName: \"kubernetes.io/projected/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-kube-api-access-bg68t\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.308499 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.410612 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg68t\" (UniqueName: \"kubernetes.io/projected/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-kube-api-access-bg68t\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.410932 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.411106 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.419355 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.419380 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.436841 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg68t\" (UniqueName: \"kubernetes.io/projected/440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5-kube-api-access-bg68t\") pod \"nova-cell0-conductor-0\" (UID: \"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5\") " pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:21 crc kubenswrapper[4894]: I0613 05:07:21.516545 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:22 crc kubenswrapper[4894]: I0613 05:07:22.019332 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jun 13 05:07:22 crc kubenswrapper[4894]: W0613 05:07:22.027493 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod440e7806_d1b2_4fc0_9d66_11f4dfa5f4a5.slice/crio-a3714118c78aa7af84226614627f5c7d0c6ab09bb4ad91a1563376083a05a45e WatchSource:0}: Error finding container a3714118c78aa7af84226614627f5c7d0c6ab09bb4ad91a1563376083a05a45e: Status 404 returned error can't find the container with id a3714118c78aa7af84226614627f5c7d0c6ab09bb4ad91a1563376083a05a45e Jun 13 05:07:22 crc kubenswrapper[4894]: I0613 05:07:22.064315 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5","Type":"ContainerStarted","Data":"a3714118c78aa7af84226614627f5c7d0c6ab09bb4ad91a1563376083a05a45e"} Jun 13 05:07:23 crc kubenswrapper[4894]: I0613 05:07:23.076600 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5","Type":"ContainerStarted","Data":"78e3cdd03cb0822ae3d3aede720d9ff6a0eb7fc35ede57a31ae833b5c88c5daf"} Jun 13 05:07:23 crc kubenswrapper[4894]: I0613 05:07:23.078540 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:23 crc kubenswrapper[4894]: I0613 05:07:23.102003 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.101981666 podStartE2EDuration="2.101981666s" podCreationTimestamp="2025-06-13 05:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:23.096415909 +0000 UTC m=+1001.542663402" watchObservedRunningTime="2025-06-13 05:07:23.101981666 +0000 UTC m=+1001.548229159" Jun 13 05:07:31 crc kubenswrapper[4894]: I0613 05:07:31.566174 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.148306 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9pcx8"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.149517 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.151487 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.151746 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.166660 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pcx8"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.234639 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzcjd\" (UniqueName: \"kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.234958 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.235164 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.235204 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.337764 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.337810 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.337851 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzcjd\" (UniqueName: \"kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.337957 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.347629 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.367493 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.369144 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.372283 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.379135 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.396540 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.400703 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.417285 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzcjd\" (UniqueName: \"kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd\") pod \"nova-cell0-cell-mapping-9pcx8\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.468536 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.476865 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.492245 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.493727 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.505028 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559255 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559324 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559379 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559402 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs529\" (UniqueName: \"kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559424 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559471 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2sg6\" (UniqueName: \"kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.559520 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.567242 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.568477 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.582214 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.641745 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660720 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660781 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660804 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs529\" (UniqueName: \"kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660823 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660872 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2sg6\" (UniqueName: \"kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660912 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.660929 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.664946 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.671359 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.681448 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.683257 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.690475 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.694863 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs529\" (UniqueName: \"kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529\") pod \"nova-api-0\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.750300 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2sg6\" (UniqueName: \"kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6\") pod \"nova-cell1-novncproxy-0\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.770975 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.772593 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.772884 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.773000 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9lj\" (UniqueName: \"kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.773041 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.781254 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.789747 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.790267 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874755 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874800 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874848 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9lj\" (UniqueName: \"kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874887 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874945 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.874968 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.875000 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjjrl\" (UniqueName: \"kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.880257 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.886180 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.893831 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.920241 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9lj\" (UniqueName: \"kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj\") pod \"nova-scheduler-0\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.961467 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.962917 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.988588 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjjrl\" (UniqueName: \"kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.988701 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.988725 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.988790 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.989150 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:32 crc kubenswrapper[4894]: I0613 05:07:32.996565 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.006728 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.020137 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjjrl\" (UniqueName: 
\"kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.044370 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " pod="openstack/nova-metadata-0" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.091275 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bh8\" (UniqueName: \"kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.091353 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.091403 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.091440 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.091474 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.138231 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.206882 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.235091 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bh8\" (UniqueName: \"kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.235159 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.235192 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.235235 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.235259 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.236696 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.237234 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.237317 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.238100 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.263623 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bh8\" (UniqueName: \"kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8\") pod \"dnsmasq-dns-59dc98b7b9-ns5ql\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.332401 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.333395 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pcx8"] Jun 13 05:07:33 crc kubenswrapper[4894]: W0613 05:07:33.582970 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e47cf0a_af48_43d2_9771_f72c6ae4afa3.slice/crio-af2d68824f1f7a8a23f22e00e8e12229d3da004c162a9dd9634c591e3d60fd2f WatchSource:0}: Error finding container af2d68824f1f7a8a23f22e00e8e12229d3da004c162a9dd9634c591e3d60fd2f: Status 404 returned error can't find the container with id af2d68824f1f7a8a23f22e00e8e12229d3da004c162a9dd9634c591e3d60fd2f Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.587824 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.598630 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.738542 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.742138 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5lt6d"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.743301 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.748967 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.749109 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.759847 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5lt6d"] Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.857180 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.857289 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.857321 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.857378 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9p5\" (UniqueName: \"kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.898495 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:07:33 crc kubenswrapper[4894]: W0613 05:07:33.904773 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9da35fe_2b5c_4ec2_8241_e7fd4ba260ff.slice/crio-443b8f0201ef95004c770e52f6dd3fadcef7d44092eb8700d7e533eaf6d4f043 WatchSource:0}: Error finding container 443b8f0201ef95004c770e52f6dd3fadcef7d44092eb8700d7e533eaf6d4f043: Status 404 returned error can't find the container with id 443b8f0201ef95004c770e52f6dd3fadcef7d44092eb8700d7e533eaf6d4f043 Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.905768 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:33 crc kubenswrapper[4894]: W0613 05:07:33.906927 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode02bbad2_9647_47c7_96cb_762f64dd9232.slice/crio-f1b36dca2aeeea6198aded244a84256de8e74dfbea6cb9b230e9fd3a83e137ad WatchSource:0}: Error finding container f1b36dca2aeeea6198aded244a84256de8e74dfbea6cb9b230e9fd3a83e137ad: Status 404 
returned error can't find the container with id f1b36dca2aeeea6198aded244a84256de8e74dfbea6cb9b230e9fd3a83e137ad Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.959893 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.959951 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.959982 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.960026 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9p5\" (UniqueName: \"kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.965448 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.965979 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.970018 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:33 crc kubenswrapper[4894]: I0613 05:07:33.982594 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9p5\" (UniqueName: \"kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5\") pod \"nova-cell1-conductor-db-sync-5lt6d\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.071423 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.248684 4894 generic.go:334] "Generic (PLEG): container finished" podID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerID="78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f" exitCode=0 Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.248738 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" event={"ID":"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff","Type":"ContainerDied","Data":"78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.248762 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" event={"ID":"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff","Type":"ContainerStarted","Data":"443b8f0201ef95004c770e52f6dd3fadcef7d44092eb8700d7e533eaf6d4f043"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.265038 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerStarted","Data":"3e6759455d5c22f53dd3f5fcd01474eef12bb1b12036be57bd999eb5e298eb46"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.266736 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pcx8" event={"ID":"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9","Type":"ContainerStarted","Data":"dbf9461b8c1db0f32df8cdd1d713125fe988052066556d56b5b06e1e2c1d948a"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.266787 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pcx8" event={"ID":"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9","Type":"ContainerStarted","Data":"5943aaf802ddbd68816f2375077d819b96bc09290b79aa6b785223d433df1faf"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.303018 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerStarted","Data":"131d3101d97540f3163cb90f01c7e050cf42d18b84e1faa18269fc5385b92e24"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.303060 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e47cf0a-af48-43d2-9771-f72c6ae4afa3","Type":"ContainerStarted","Data":"af2d68824f1f7a8a23f22e00e8e12229d3da004c162a9dd9634c591e3d60fd2f"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.305249 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e02bbad2-9647-47c7-96cb-762f64dd9232","Type":"ContainerStarted","Data":"f1b36dca2aeeea6198aded244a84256de8e74dfbea6cb9b230e9fd3a83e137ad"} Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.313158 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9pcx8" podStartSLOduration=2.313138822 podStartE2EDuration="2.313138822s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:34.310607731 +0000 UTC m=+1012.756855194" watchObservedRunningTime="2025-06-13 05:07:34.313138822 +0000 UTC m=+1012.759386285" Jun 13 05:07:34 crc kubenswrapper[4894]: I0613 05:07:34.383323 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5lt6d"] Jun 13 
05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.321340 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" event={"ID":"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff","Type":"ContainerStarted","Data":"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314"} Jun 13 05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.322161 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.326171 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" event={"ID":"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e","Type":"ContainerStarted","Data":"60473c4bdbdb5c31f704e449ad05fc0f353847c193e5234c2a89c530ccc096ed"} Jun 13 05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.326205 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" event={"ID":"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e","Type":"ContainerStarted","Data":"030b960ff0c04b76d88b3f27a79f74846fd7eafa72197fd98bbbd6a774b5bb63"} Jun 13 05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.345198 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" podStartSLOduration=3.345183186 podStartE2EDuration="3.345183186s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:35.339538647 +0000 UTC m=+1013.785786130" watchObservedRunningTime="2025-06-13 05:07:35.345183186 +0000 UTC m=+1013.791430649" Jun 13 05:07:35 crc kubenswrapper[4894]: I0613 05:07:35.357505 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" podStartSLOduration=2.357491633 podStartE2EDuration="2.357491633s" podCreationTimestamp="2025-06-13 05:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:35.353092439 +0000 UTC m=+1013.799339902" watchObservedRunningTime="2025-06-13 05:07:35.357491633 +0000 UTC m=+1013.803739096" Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.348289 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.359971 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerStarted","Data":"22b252e98b49ff80c6f897b360a7754173beeda7a0548f9439054421c97cfb9d"} Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.385694 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerStarted","Data":"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd"} Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.388628 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e47cf0a-af48-43d2-9771-f72c6ae4afa3","Type":"ContainerStarted","Data":"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e"} Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.390571 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e02bbad2-9647-47c7-96cb-762f64dd9232","Type":"ContainerStarted","Data":"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50"} Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.433590 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.334103082 podStartE2EDuration="5.433575746s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="2025-06-13 05:07:33.6096315 +0000 UTC m=+1012.055878963" lastFinishedPulling="2025-06-13 05:07:36.709104164 +0000 UTC m=+1015.155351627" observedRunningTime="2025-06-13 05:07:37.422886525 +0000 UTC m=+1015.869133988" watchObservedRunningTime="2025-06-13 05:07:37.433575746 +0000 UTC m=+1015.879823209" Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.447259 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.646631383 podStartE2EDuration="5.447242961s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="2025-06-13 05:07:33.9086218 +0000 UTC m=+1012.354869263" lastFinishedPulling="2025-06-13 05:07:36.709233388 +0000 UTC m=+1015.155480841" observedRunningTime="2025-06-13 05:07:37.440421419 +0000 UTC m=+1015.886668872" watchObservedRunningTime="2025-06-13 05:07:37.447242961 +0000 UTC m=+1015.893490424" Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.585513 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.596874 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:37 crc kubenswrapper[4894]: I0613 05:07:37.791453 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.207603 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.399045 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerStarted","Data":"48f3a6bc7cef514282372f1c61d6c1eacc64873ad407e21a5266b39625c132a5"} Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.400841 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerStarted","Data":"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10"} Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.401017 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-log" containerID="cri-o://137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" gracePeriod=30 Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.401070 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-metadata" containerID="cri-o://3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" gracePeriod=30 Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.422280 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.297293237 
podStartE2EDuration="6.422264939s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="2025-06-13 05:07:33.583308159 +0000 UTC m=+1012.029555622" lastFinishedPulling="2025-06-13 05:07:36.708279871 +0000 UTC m=+1015.154527324" observedRunningTime="2025-06-13 05:07:38.418954396 +0000 UTC m=+1016.865201859" watchObservedRunningTime="2025-06-13 05:07:38.422264939 +0000 UTC m=+1016.868512402" Jun 13 05:07:38 crc kubenswrapper[4894]: I0613 05:07:38.440266 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.473014896 podStartE2EDuration="6.440251026s" podCreationTimestamp="2025-06-13 05:07:32 +0000 UTC" firstStartedPulling="2025-06-13 05:07:33.74596733 +0000 UTC m=+1012.192214793" lastFinishedPulling="2025-06-13 05:07:36.71320346 +0000 UTC m=+1015.159450923" observedRunningTime="2025-06-13 05:07:38.43544569 +0000 UTC m=+1016.881693153" watchObservedRunningTime="2025-06-13 05:07:38.440251026 +0000 UTC m=+1016.886498489" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.003072 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.052400 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data\") pod \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.052441 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs\") pod \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.052487 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjjrl\" (UniqueName: \"kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl\") pod \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.052553 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle\") pod \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\" (UID: \"74806f1c-a9aa-4896-a90a-545f7cbfd4d5\") " Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.052803 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs" (OuterVolumeSpecName: "logs") pod "74806f1c-a9aa-4896-a90a-545f7cbfd4d5" (UID: "74806f1c-a9aa-4896-a90a-545f7cbfd4d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.053308 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.071821 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl" (OuterVolumeSpecName: "kube-api-access-cjjrl") pod "74806f1c-a9aa-4896-a90a-545f7cbfd4d5" (UID: "74806f1c-a9aa-4896-a90a-545f7cbfd4d5"). InnerVolumeSpecName "kube-api-access-cjjrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.097159 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data" (OuterVolumeSpecName: "config-data") pod "74806f1c-a9aa-4896-a90a-545f7cbfd4d5" (UID: "74806f1c-a9aa-4896-a90a-545f7cbfd4d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.099794 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74806f1c-a9aa-4896-a90a-545f7cbfd4d5" (UID: "74806f1c-a9aa-4896-a90a-545f7cbfd4d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.154426 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.154461 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjjrl\" (UniqueName: \"kubernetes.io/projected/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-kube-api-access-cjjrl\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.154473 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74806f1c-a9aa-4896-a90a-545f7cbfd4d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.409941 4894 generic.go:334] "Generic (PLEG): container finished" podID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerID="3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" exitCode=0 Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.409980 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerDied","Data":"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10"} Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.410062 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerDied","Data":"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd"} Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.410070 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.409992 4894 generic.go:334] "Generic (PLEG): container finished" podID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerID="137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" exitCode=143 Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.411431 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74806f1c-a9aa-4896-a90a-545f7cbfd4d5","Type":"ContainerDied","Data":"131d3101d97540f3163cb90f01c7e050cf42d18b84e1faa18269fc5385b92e24"} Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.410109 4894 scope.go:117] "RemoveContainer" containerID="3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.411744 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e" gracePeriod=30 Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.448291 4894 scope.go:117] "RemoveContainer" containerID="137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.464737 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.476068 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.480207 4894 scope.go:117] "RemoveContainer" containerID="3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" Jun 13 05:07:39 crc kubenswrapper[4894]: E0613 05:07:39.482176 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10\": container with ID starting with 3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10 not found: ID does not exist" containerID="3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.482227 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10"} err="failed to get container status \"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10\": rpc error: code = NotFound desc = could not find container \"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10\": container with ID starting with 3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10 not found: ID does not exist" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.482253 4894 scope.go:117] "RemoveContainer" containerID="137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" Jun 13 05:07:39 crc kubenswrapper[4894]: E0613 05:07:39.489327 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd\": container with ID starting with 137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd not found: ID does not exist" containerID="137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" Jun 13 05:07:39 crc 
kubenswrapper[4894]: I0613 05:07:39.489411 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd"} err="failed to get container status \"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd\": rpc error: code = NotFound desc = could not find container \"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd\": container with ID starting with 137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd not found: ID does not exist" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.489437 4894 scope.go:117] "RemoveContainer" containerID="3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.494962 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10"} err="failed to get container status \"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10\": rpc error: code = NotFound desc = could not find container \"3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10\": container with ID starting with 3f6a9263f596a9d3a52ccbcc5e94477b7fe1b212d9e167c93fd1ad6fc43b6a10 not found: ID does not exist" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.495128 4894 scope.go:117] "RemoveContainer" containerID="137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.498960 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd"} err="failed to get container status \"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd\": rpc error: code = NotFound desc = could not find container \"137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd\": container with ID starting with 137a56f5724e899796e2f328fca06222a89321fe62a5299ccb61c060bb5d9ebd not found: ID does not exist" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.508141 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:39 crc kubenswrapper[4894]: E0613 05:07:39.509484 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-metadata" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.519754 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-metadata" Jun 13 05:07:39 crc kubenswrapper[4894]: E0613 05:07:39.519832 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-log" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.519840 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-log" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.520440 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-metadata" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.520455 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" containerName="nova-metadata-log" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.528421 
4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.536901 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.537298 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.583886 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:39 crc kubenswrapper[4894]: E0613 05:07:39.636535 4894 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74806f1c_a9aa_4896_a90a_545f7cbfd4d5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74806f1c_a9aa_4896_a90a_545f7cbfd4d5.slice/crio-131d3101d97540f3163cb90f01c7e050cf42d18b84e1faa18269fc5385b92e24\": RecentStats: unable to find data in memory cache]" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.672909 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hdg\" (UniqueName: \"kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.673085 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.673112 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.673142 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.673228 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.775139 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.775185 4894 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.775229 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.775253 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.775303 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hdg\" (UniqueName: \"kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.776118 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.779679 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.785671 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.786203 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.789762 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hdg\" (UniqueName: \"kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg\") pod \"nova-metadata-0\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " pod="openstack/nova-metadata-0" Jun 13 05:07:39 crc kubenswrapper[4894]: I0613 05:07:39.872905 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.245587 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.286826 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74806f1c-a9aa-4896-a90a-545f7cbfd4d5" path="/var/lib/kubelet/pods/74806f1c-a9aa-4896-a90a-545f7cbfd4d5/volumes" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.290520 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle\") pod \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.290555 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2sg6\" (UniqueName: \"kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6\") pod \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.290722 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data\") pod \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\" (UID: \"0e47cf0a-af48-43d2-9771-f72c6ae4afa3\") " Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.314946 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data" (OuterVolumeSpecName: "config-data") pod "0e47cf0a-af48-43d2-9771-f72c6ae4afa3" (UID: "0e47cf0a-af48-43d2-9771-f72c6ae4afa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.316524 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6" (OuterVolumeSpecName: "kube-api-access-z2sg6") pod "0e47cf0a-af48-43d2-9771-f72c6ae4afa3" (UID: "0e47cf0a-af48-43d2-9771-f72c6ae4afa3"). InnerVolumeSpecName "kube-api-access-z2sg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.316735 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e47cf0a-af48-43d2-9771-f72c6ae4afa3" (UID: "0e47cf0a-af48-43d2-9771-f72c6ae4afa3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.387625 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.393289 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.393315 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.393327 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2sg6\" (UniqueName: \"kubernetes.io/projected/0e47cf0a-af48-43d2-9771-f72c6ae4afa3-kube-api-access-z2sg6\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:40 crc kubenswrapper[4894]: W0613 05:07:40.396382 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod141236d7_14d5_4d21_ac69_e52e582f87bf.slice/crio-0aa65ece8d43839aabf5bceabb4e6dbe7ceaf449cb35e37b06c9a838f40881af WatchSource:0}: Error finding container 0aa65ece8d43839aabf5bceabb4e6dbe7ceaf449cb35e37b06c9a838f40881af: Status 404 returned error can't find the container with id 0aa65ece8d43839aabf5bceabb4e6dbe7ceaf449cb35e37b06c9a838f40881af Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.420626 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerStarted","Data":"0aa65ece8d43839aabf5bceabb4e6dbe7ceaf449cb35e37b06c9a838f40881af"} Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.422995 4894 generic.go:334] "Generic (PLEG): container finished" podID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" containerID="7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e" exitCode=0 Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.423028 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e47cf0a-af48-43d2-9771-f72c6ae4afa3","Type":"ContainerDied","Data":"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e"} Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.423044 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.423058 4894 scope.go:117] "RemoveContainer" containerID="7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.423047 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0e47cf0a-af48-43d2-9771-f72c6ae4afa3","Type":"ContainerDied","Data":"af2d68824f1f7a8a23f22e00e8e12229d3da004c162a9dd9634c591e3d60fd2f"} Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.447767 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.450090 4894 scope.go:117] "RemoveContainer" containerID="7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e" Jun 13 05:07:40 crc kubenswrapper[4894]: E0613 05:07:40.450443 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e\": container with ID starting with 7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e not found: ID does not exist" containerID="7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.450483 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e"} err="failed to get container status \"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e\": rpc error: code = NotFound desc = could not find container \"7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e\": container with ID starting with 7b2f9a21dc3ebf6dd14f30b9e33fa9390fa35510cf9216548ad0df8f7c80145e not found: ID does not exist" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.457481 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.465582 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:40 crc kubenswrapper[4894]: E0613 05:07:40.465919 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" containerName="nova-cell1-novncproxy-novncproxy" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.465931 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" containerName="nova-cell1-novncproxy-novncproxy" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.466124 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" containerName="nova-cell1-novncproxy-novncproxy" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.466671 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.469212 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.472711 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.475516 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.478309 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.495006 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.495073 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.495171 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.495215 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.495229 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmjts\" (UniqueName: \"kubernetes.io/projected/088bef64-0dd7-48f3-9977-6fc21e24686a-kube-api-access-rmjts\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.596501 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.596559 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " 
pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.596576 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmjts\" (UniqueName: \"kubernetes.io/projected/088bef64-0dd7-48f3-9977-6fc21e24686a-kube-api-access-rmjts\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.596607 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.596647 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.600363 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.600768 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.603939 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.606247 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/088bef64-0dd7-48f3-9977-6fc21e24686a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.612307 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmjts\" (UniqueName: \"kubernetes.io/projected/088bef64-0dd7-48f3-9977-6fc21e24686a-kube-api-access-rmjts\") pod \"nova-cell1-novncproxy-0\" (UID: \"088bef64-0dd7-48f3-9977-6fc21e24686a\") " pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:40 crc kubenswrapper[4894]: I0613 05:07:40.812771 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.320616 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jun 13 05:07:41 crc kubenswrapper[4894]: W0613 05:07:41.326680 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod088bef64_0dd7_48f3_9977_6fc21e24686a.slice/crio-96f92d328522ac0323dcc1ff9dc5fd951b6005e75ac5a9bfa415a8afcc1fe355 WatchSource:0}: Error finding container 96f92d328522ac0323dcc1ff9dc5fd951b6005e75ac5a9bfa415a8afcc1fe355: Status 404 returned error can't find the container with id 96f92d328522ac0323dcc1ff9dc5fd951b6005e75ac5a9bfa415a8afcc1fe355 Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.437195 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerStarted","Data":"c5046dd3300c357196eaeb883000cb9f036dbb57afc6398a1669eab02f7fce2e"} Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.437255 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerStarted","Data":"48671de0ddbc7b294120d662922bc8ab50c947475b5e987628a737726e666dab"} Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.453043 4894 generic.go:334] "Generic (PLEG): container finished" podID="f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" containerID="dbf9461b8c1db0f32df8cdd1d713125fe988052066556d56b5b06e1e2c1d948a" exitCode=0 Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.453240 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pcx8" event={"ID":"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9","Type":"ContainerDied","Data":"dbf9461b8c1db0f32df8cdd1d713125fe988052066556d56b5b06e1e2c1d948a"} Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.459680 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"088bef64-0dd7-48f3-9977-6fc21e24686a","Type":"ContainerStarted","Data":"96f92d328522ac0323dcc1ff9dc5fd951b6005e75ac5a9bfa415a8afcc1fe355"} Jun 13 05:07:41 crc kubenswrapper[4894]: I0613 05:07:41.465354 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.465343745 podStartE2EDuration="2.465343745s" podCreationTimestamp="2025-06-13 05:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:41.460418836 +0000 UTC m=+1019.906666299" watchObservedRunningTime="2025-06-13 05:07:41.465343745 +0000 UTC m=+1019.911591208" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.293991 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e47cf0a-af48-43d2-9771-f72c6ae4afa3" path="/var/lib/kubelet/pods/0e47cf0a-af48-43d2-9771-f72c6ae4afa3/volumes" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.473298 4894 generic.go:334] "Generic (PLEG): container finished" podID="ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" containerID="60473c4bdbdb5c31f704e449ad05fc0f353847c193e5234c2a89c530ccc096ed" exitCode=0 Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.473450 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" 
event={"ID":"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e","Type":"ContainerDied","Data":"60473c4bdbdb5c31f704e449ad05fc0f353847c193e5234c2a89c530ccc096ed"} Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.478930 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"088bef64-0dd7-48f3-9977-6fc21e24686a","Type":"ContainerStarted","Data":"1bfdc36dff98ee53136e03a785a63fea1e3e2339df8fb8e9698b873a6fcfafeb"} Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.552436 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.552417838 podStartE2EDuration="2.552417838s" podCreationTimestamp="2025-06-13 05:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:42.53294996 +0000 UTC m=+1020.979197463" watchObservedRunningTime="2025-06-13 05:07:42.552417838 +0000 UTC m=+1020.998665311" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.781947 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.782030 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.895691 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.952186 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts\") pod \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.952277 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data\") pod \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.952340 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzcjd\" (UniqueName: \"kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd\") pod \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.952371 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle\") pod \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\" (UID: \"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9\") " Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.971167 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts" (OuterVolumeSpecName: "scripts") pod "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" (UID: "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.973028 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd" (OuterVolumeSpecName: "kube-api-access-gzcjd") pod "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" (UID: "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9"). InnerVolumeSpecName "kube-api-access-gzcjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.982273 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" (UID: "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:42 crc kubenswrapper[4894]: I0613 05:07:42.988732 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data" (OuterVolumeSpecName: "config-data") pod "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" (UID: "f7a969ac-8f70-4cd4-bf65-75a10c7f14e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.054176 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzcjd\" (UniqueName: \"kubernetes.io/projected/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-kube-api-access-gzcjd\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.054424 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.054435 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.054445 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.207072 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.233329 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.334437 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.421537 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.441006 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="dnsmasq-dns" containerID="cri-o://60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e" gracePeriod=10 Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 
05:07:43.495028 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pcx8" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.495379 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pcx8" event={"ID":"f7a969ac-8f70-4cd4-bf65-75a10c7f14e9","Type":"ContainerDied","Data":"5943aaf802ddbd68816f2375077d819b96bc09290b79aa6b785223d433df1faf"} Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.495413 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5943aaf802ddbd68816f2375077d819b96bc09290b79aa6b785223d433df1faf" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.559123 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.825540 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.825792 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-log" containerID="cri-o://22b252e98b49ff80c6f897b360a7754173beeda7a0548f9439054421c97cfb9d" gracePeriod=30 Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.826155 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-api" containerID="cri-o://48f3a6bc7cef514282372f1c61d6c1eacc64873ad407e21a5266b39625c132a5" gracePeriod=30 Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.869795 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.870160 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.933250 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.933455 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-log" containerID="cri-o://48671de0ddbc7b294120d662922bc8ab50c947475b5e987628a737726e666dab" gracePeriod=30 Jun 13 05:07:43 crc kubenswrapper[4894]: I0613 05:07:43.933807 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-metadata" containerID="cri-o://c5046dd3300c357196eaeb883000cb9f036dbb57afc6398a1669eab02f7fce2e" gracePeriod=30 Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.142521 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.195302 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts\") pod \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.195521 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx9p5\" (UniqueName: \"kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5\") pod \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.195549 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data\") pod \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.195606 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle\") pod \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\" (UID: \"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.219396 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5" (OuterVolumeSpecName: "kube-api-access-dx9p5") pod "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" (UID: "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e"). InnerVolumeSpecName "kube-api-access-dx9p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.222948 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts" (OuterVolumeSpecName: "scripts") pod "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" (UID: "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.249283 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data" (OuterVolumeSpecName: "config-data") pod "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" (UID: "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.252597 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" (UID: "ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.300645 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.300732 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.300743 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx9p5\" (UniqueName: \"kubernetes.io/projected/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-kube-api-access-dx9p5\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.300754 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.347297 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.403210 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb\") pod \"9f9b589d-4cbe-4858-986e-f7f9ec698137\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.403301 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb\") pod \"9f9b589d-4cbe-4858-986e-f7f9ec698137\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.403360 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config\") pod \"9f9b589d-4cbe-4858-986e-f7f9ec698137\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.403422 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnq29\" (UniqueName: \"kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29\") pod \"9f9b589d-4cbe-4858-986e-f7f9ec698137\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.403494 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc\") pod \"9f9b589d-4cbe-4858-986e-f7f9ec698137\" (UID: \"9f9b589d-4cbe-4858-986e-f7f9ec698137\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.416776 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29" (OuterVolumeSpecName: "kube-api-access-cnq29") pod "9f9b589d-4cbe-4858-986e-f7f9ec698137" (UID: "9f9b589d-4cbe-4858-986e-f7f9ec698137"). InnerVolumeSpecName "kube-api-access-cnq29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.473203 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f9b589d-4cbe-4858-986e-f7f9ec698137" (UID: "9f9b589d-4cbe-4858-986e-f7f9ec698137"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.488366 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config" (OuterVolumeSpecName: "config") pod "9f9b589d-4cbe-4858-986e-f7f9ec698137" (UID: "9f9b589d-4cbe-4858-986e-f7f9ec698137"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.488704 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f9b589d-4cbe-4858-986e-f7f9ec698137" (UID: "9f9b589d-4cbe-4858-986e-f7f9ec698137"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.491753 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f9b589d-4cbe-4858-986e-f7f9ec698137" (UID: "9f9b589d-4cbe-4858-986e-f7f9ec698137"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.504121 4894 generic.go:334] "Generic (PLEG): container finished" podID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerID="c5046dd3300c357196eaeb883000cb9f036dbb57afc6398a1669eab02f7fce2e" exitCode=0 Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.504150 4894 generic.go:334] "Generic (PLEG): container finished" podID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerID="48671de0ddbc7b294120d662922bc8ab50c947475b5e987628a737726e666dab" exitCode=143 Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.504187 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerDied","Data":"c5046dd3300c357196eaeb883000cb9f036dbb57afc6398a1669eab02f7fce2e"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.504213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerDied","Data":"48671de0ddbc7b294120d662922bc8ab50c947475b5e987628a737726e666dab"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.505230 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" event={"ID":"ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e","Type":"ContainerDied","Data":"030b960ff0c04b76d88b3f27a79f74846fd7eafa72197fd98bbbd6a774b5bb63"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.505254 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030b960ff0c04b76d88b3f27a79f74846fd7eafa72197fd98bbbd6a774b5bb63" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.505316 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5lt6d" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.506850 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.506876 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.506886 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.506895 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9b589d-4cbe-4858-986e-f7f9ec698137-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.506911 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnq29\" (UniqueName: \"kubernetes.io/projected/9f9b589d-4cbe-4858-986e-f7f9ec698137-kube-api-access-cnq29\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.510375 4894 generic.go:334] "Generic (PLEG): container finished" podID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerID="60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e" exitCode=0 Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.510434 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" event={"ID":"9f9b589d-4cbe-4858-986e-f7f9ec698137","Type":"ContainerDied","Data":"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.510473 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" event={"ID":"9f9b589d-4cbe-4858-986e-f7f9ec698137","Type":"ContainerDied","Data":"e328848ab9a325aa86f462cf2e4fc0db69228931089734f98de02fb386dc665a"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.510489 4894 scope.go:117] "RemoveContainer" containerID="60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.510587 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75cc475fb9-tpsp4" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.519294 4894 generic.go:334] "Generic (PLEG): container finished" podID="7357459b-96d2-4e31-8a75-a772df0ec307" containerID="22b252e98b49ff80c6f897b360a7754173beeda7a0548f9439054421c97cfb9d" exitCode=143 Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.519484 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerDied","Data":"22b252e98b49ff80c6f897b360a7754173beeda7a0548f9439054421c97cfb9d"} Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.539398 4894 scope.go:117] "RemoveContainer" containerID="9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.552758 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.563342 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75cc475fb9-tpsp4"] Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.563890 4894 scope.go:117] "RemoveContainer" containerID="60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e" Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.564840 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e\": container with ID starting with 60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e not found: ID does not exist" containerID="60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.564871 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e"} err="failed to get container status \"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e\": rpc error: code = NotFound desc = could not find container \"60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e\": container with ID starting with 60a8bb37c9247b0a2db950972823ba2e10e0dbbc49e7907f0096f3211e25904e not found: ID does not exist" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.564896 4894 scope.go:117] "RemoveContainer" containerID="9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85" Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.568035 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85\": container with ID starting with 9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85 not found: ID does not exist" containerID="9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.568060 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85"} err="failed to get container status \"9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85\": rpc error: code = NotFound desc = could not find container \"9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85\": container with ID starting with 
9e3fc59b37237acbbe79ba329d533e4a26f47c997ba7689947efbdebb3350c85 not found: ID does not exist" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.572915 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.601606 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.608905 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" containerName="nova-manage" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.608933 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" containerName="nova-manage" Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.608956 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" containerName="nova-cell1-conductor-db-sync" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.608962 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" containerName="nova-cell1-conductor-db-sync" Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.608982 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="init" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.608987 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="init" Jun 13 05:07:44 crc kubenswrapper[4894]: E0613 05:07:44.608997 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="dnsmasq-dns" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.609004 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="dnsmasq-dns" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.609175 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" containerName="dnsmasq-dns" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.609197 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" containerName="nova-manage" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.609205 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" containerName="nova-cell1-conductor-db-sync" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.609792 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.614548 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.626343 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.709732 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.709971 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.710075 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8x74\" (UniqueName: \"kubernetes.io/projected/c7fe666a-475a-4f55-9edb-4f19b5b87f73-kube-api-access-q8x74\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.767054 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811003 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95hdg\" (UniqueName: \"kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg\") pod \"141236d7-14d5-4d21-ac69-e52e582f87bf\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811141 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle\") pod \"141236d7-14d5-4d21-ac69-e52e582f87bf\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811281 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data\") pod \"141236d7-14d5-4d21-ac69-e52e582f87bf\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811526 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811559 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8x74\" (UniqueName: \"kubernetes.io/projected/c7fe666a-475a-4f55-9edb-4f19b5b87f73-kube-api-access-q8x74\") pod \"nova-cell1-conductor-0\" (UID: 
\"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.811635 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.824310 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.824520 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg" (OuterVolumeSpecName: "kube-api-access-95hdg") pod "141236d7-14d5-4d21-ac69-e52e582f87bf" (UID: "141236d7-14d5-4d21-ac69-e52e582f87bf"). InnerVolumeSpecName "kube-api-access-95hdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.827864 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8x74\" (UniqueName: \"kubernetes.io/projected/c7fe666a-475a-4f55-9edb-4f19b5b87f73-kube-api-access-q8x74\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.832292 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe666a-475a-4f55-9edb-4f19b5b87f73-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7fe666a-475a-4f55-9edb-4f19b5b87f73\") " pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.837948 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "141236d7-14d5-4d21-ac69-e52e582f87bf" (UID: "141236d7-14d5-4d21-ac69-e52e582f87bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.848981 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data" (OuterVolumeSpecName: "config-data") pod "141236d7-14d5-4d21-ac69-e52e582f87bf" (UID: "141236d7-14d5-4d21-ac69-e52e582f87bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.912459 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs\") pod \"141236d7-14d5-4d21-ac69-e52e582f87bf\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.912516 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs\") pod \"141236d7-14d5-4d21-ac69-e52e582f87bf\" (UID: \"141236d7-14d5-4d21-ac69-e52e582f87bf\") " Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.912802 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.912813 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95hdg\" (UniqueName: \"kubernetes.io/projected/141236d7-14d5-4d21-ac69-e52e582f87bf-kube-api-access-95hdg\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.912823 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.913312 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs" (OuterVolumeSpecName: "logs") pod "141236d7-14d5-4d21-ac69-e52e582f87bf" (UID: "141236d7-14d5-4d21-ac69-e52e582f87bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.923537 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:44 crc kubenswrapper[4894]: I0613 05:07:44.964395 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "141236d7-14d5-4d21-ac69-e52e582f87bf" (UID: "141236d7-14d5-4d21-ac69-e52e582f87bf"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.015015 4894 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/141236d7-14d5-4d21-ac69-e52e582f87bf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.015048 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141236d7-14d5-4d21-ac69-e52e582f87bf-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.371435 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jun 13 05:07:45 crc kubenswrapper[4894]: W0613 05:07:45.386763 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fe666a_475a_4f55_9edb_4f19b5b87f73.slice/crio-5d5de2e9fe736a96218add0122d40114097843b94d26a4d981d4757d8126b0d0 WatchSource:0}: Error finding container 5d5de2e9fe736a96218add0122d40114097843b94d26a4d981d4757d8126b0d0: Status 404 returned error can't find the container with id 5d5de2e9fe736a96218add0122d40114097843b94d26a4d981d4757d8126b0d0 Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.535340 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7fe666a-475a-4f55-9edb-4f19b5b87f73","Type":"ContainerStarted","Data":"5d5de2e9fe736a96218add0122d40114097843b94d26a4d981d4757d8126b0d0"} Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.542290 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"141236d7-14d5-4d21-ac69-e52e582f87bf","Type":"ContainerDied","Data":"0aa65ece8d43839aabf5bceabb4e6dbe7ceaf449cb35e37b06c9a838f40881af"} Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.542359 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerName="nova-scheduler-scheduler" containerID="cri-o://c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" gracePeriod=30 Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.542532 4894 scope.go:117] "RemoveContainer" containerID="c5046dd3300c357196eaeb883000cb9f036dbb57afc6398a1669eab02f7fce2e" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.542297 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.573555 4894 scope.go:117] "RemoveContainer" containerID="48671de0ddbc7b294120d662922bc8ab50c947475b5e987628a737726e666dab" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.577976 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.586096 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.595041 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:45 crc kubenswrapper[4894]: E0613 05:07:45.595450 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-log" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.595517 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-log" Jun 13 05:07:45 crc kubenswrapper[4894]: E0613 05:07:45.595586 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-metadata" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.595636 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-metadata" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.595863 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-log" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.595939 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" containerName="nova-metadata-metadata" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.596843 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.599217 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.599219 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.618702 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.730802 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgflv\" (UniqueName: \"kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.730866 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.730892 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.730914 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.730933 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.813342 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.832267 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgflv\" (UniqueName: \"kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.832391 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.832448 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.832506 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.832537 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.833164 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.836813 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.837407 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.838243 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.865377 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgflv\" (UniqueName: \"kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv\") pod \"nova-metadata-0\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " pod="openstack/nova-metadata-0" Jun 13 05:07:45 crc kubenswrapper[4894]: I0613 05:07:45.916729 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.287647 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="141236d7-14d5-4d21-ac69-e52e582f87bf" path="/var/lib/kubelet/pods/141236d7-14d5-4d21-ac69-e52e582f87bf/volumes" Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.289126 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9b589d-4cbe-4858-986e-f7f9ec698137" path="/var/lib/kubelet/pods/9f9b589d-4cbe-4858-986e-f7f9ec698137/volumes" Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.401986 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.569466 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7fe666a-475a-4f55-9edb-4f19b5b87f73","Type":"ContainerStarted","Data":"7f94189edadb28f0926b8e48dd07ee065a8282687da04723ccb4379bd1fb05b4"} Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.570168 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.578615 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerStarted","Data":"ae0846eedebf891117c8abb8837a1b0d5a2d470206b2ed1a8148fd2181cc2a89"} Jun 13 05:07:46 crc kubenswrapper[4894]: I0613 05:07:46.590838 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.590820772 podStartE2EDuration="2.590820772s" podCreationTimestamp="2025-06-13 05:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:46.589316879 +0000 UTC m=+1025.035564352" watchObservedRunningTime="2025-06-13 05:07:46.590820772 +0000 UTC m=+1025.037068245" Jun 13 05:07:47 crc kubenswrapper[4894]: I0613 05:07:47.593061 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerStarted","Data":"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a"} Jun 13 05:07:47 crc kubenswrapper[4894]: I0613 05:07:47.593396 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerStarted","Data":"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3"} Jun 13 05:07:47 crc kubenswrapper[4894]: I0613 05:07:47.619127 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.619110989 podStartE2EDuration="2.619110989s" podCreationTimestamp="2025-06-13 05:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:47.612536984 +0000 UTC m=+1026.058784457" watchObservedRunningTime="2025-06-13 05:07:47.619110989 +0000 UTC m=+1026.065358462" Jun 13 05:07:48 crc kubenswrapper[4894]: E0613 05:07:48.210022 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:07:48 crc kubenswrapper[4894]: E0613 05:07:48.211378 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:07:48 crc kubenswrapper[4894]: E0613 05:07:48.212871 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:07:48 crc kubenswrapper[4894]: E0613 05:07:48.212969 4894 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerName="nova-scheduler-scheduler" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.503479 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.612384 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data\") pod \"e02bbad2-9647-47c7-96cb-762f64dd9232\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.612444 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv9lj\" (UniqueName: \"kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj\") pod \"e02bbad2-9647-47c7-96cb-762f64dd9232\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.612591 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle\") pod \"e02bbad2-9647-47c7-96cb-762f64dd9232\" (UID: \"e02bbad2-9647-47c7-96cb-762f64dd9232\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.619409 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj" (OuterVolumeSpecName: "kube-api-access-rv9lj") pod "e02bbad2-9647-47c7-96cb-762f64dd9232" (UID: "e02bbad2-9647-47c7-96cb-762f64dd9232"). InnerVolumeSpecName "kube-api-access-rv9lj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.625479 4894 generic.go:334] "Generic (PLEG): container finished" podID="7357459b-96d2-4e31-8a75-a772df0ec307" containerID="48f3a6bc7cef514282372f1c61d6c1eacc64873ad407e21a5266b39625c132a5" exitCode=0 Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.625538 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerDied","Data":"48f3a6bc7cef514282372f1c61d6c1eacc64873ad407e21a5266b39625c132a5"} Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.627056 4894 generic.go:334] "Generic (PLEG): container finished" podID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" exitCode=0 Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.627359 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e02bbad2-9647-47c7-96cb-762f64dd9232","Type":"ContainerDied","Data":"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50"} Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.627378 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e02bbad2-9647-47c7-96cb-762f64dd9232","Type":"ContainerDied","Data":"f1b36dca2aeeea6198aded244a84256de8e74dfbea6cb9b230e9fd3a83e137ad"} Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.627398 4894 scope.go:117] "RemoveContainer" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.627644 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.654813 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e02bbad2-9647-47c7-96cb-762f64dd9232" (UID: "e02bbad2-9647-47c7-96cb-762f64dd9232"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.686818 4894 scope.go:117] "RemoveContainer" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" Jun 13 05:07:49 crc kubenswrapper[4894]: E0613 05:07:49.687272 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50\": container with ID starting with c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50 not found: ID does not exist" containerID="c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.687303 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50"} err="failed to get container status \"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50\": rpc error: code = NotFound desc = could not find container \"c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50\": container with ID starting with c661bdf035327d3cf45dc8bd655c740cb02d21baf9357e7f489efe3028573e50 not found: ID does not exist" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.688122 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data" (OuterVolumeSpecName: "config-data") pod "e02bbad2-9647-47c7-96cb-762f64dd9232" (UID: "e02bbad2-9647-47c7-96cb-762f64dd9232"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.714835 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.714881 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02bbad2-9647-47c7-96cb-762f64dd9232-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.714892 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv9lj\" (UniqueName: \"kubernetes.io/projected/e02bbad2-9647-47c7-96cb-762f64dd9232-kube-api-access-rv9lj\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.804666 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.918237 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs529\" (UniqueName: \"kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529\") pod \"7357459b-96d2-4e31-8a75-a772df0ec307\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.918717 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data\") pod \"7357459b-96d2-4e31-8a75-a772df0ec307\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.918822 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs\") pod \"7357459b-96d2-4e31-8a75-a772df0ec307\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.918906 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle\") pod \"7357459b-96d2-4e31-8a75-a772df0ec307\" (UID: \"7357459b-96d2-4e31-8a75-a772df0ec307\") " Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.919849 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs" (OuterVolumeSpecName: "logs") pod "7357459b-96d2-4e31-8a75-a772df0ec307" (UID: "7357459b-96d2-4e31-8a75-a772df0ec307"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.921577 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529" (OuterVolumeSpecName: "kube-api-access-bs529") pod "7357459b-96d2-4e31-8a75-a772df0ec307" (UID: "7357459b-96d2-4e31-8a75-a772df0ec307"). InnerVolumeSpecName "kube-api-access-bs529". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.946294 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7357459b-96d2-4e31-8a75-a772df0ec307" (UID: "7357459b-96d2-4e31-8a75-a772df0ec307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:49 crc kubenswrapper[4894]: I0613 05:07:49.949941 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data" (OuterVolumeSpecName: "config-data") pod "7357459b-96d2-4e31-8a75-a772df0ec307" (UID: "7357459b-96d2-4e31-8a75-a772df0ec307"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.015830 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.020664 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs529\" (UniqueName: \"kubernetes.io/projected/7357459b-96d2-4e31-8a75-a772df0ec307-kube-api-access-bs529\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.020690 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.020700 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7357459b-96d2-4e31-8a75-a772df0ec307-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.020708 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357459b-96d2-4e31-8a75-a772df0ec307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.024237 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038218 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: E0613 05:07:50.038533 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerName="nova-scheduler-scheduler" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038548 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerName="nova-scheduler-scheduler" Jun 13 05:07:50 crc kubenswrapper[4894]: E0613 05:07:50.038568 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-api" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038575 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-api" Jun 13 05:07:50 crc kubenswrapper[4894]: E0613 05:07:50.038606 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-log" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038612 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-log" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038767 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-log" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038786 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" containerName="nova-scheduler-scheduler" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.038805 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" containerName="nova-api-api" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.039338 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.041274 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.055995 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.122139 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6vn\" (UniqueName: \"kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.122219 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.122344 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.224286 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.224528 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.224701 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6vn\" (UniqueName: \"kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.230542 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.238232 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.240821 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6vn\" (UniqueName: 
\"kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn\") pod \"nova-scheduler-0\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.286837 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02bbad2-9647-47c7-96cb-762f64dd9232" path="/var/lib/kubelet/pods/e02bbad2-9647-47c7-96cb-762f64dd9232/volumes" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.358676 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.644170 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7357459b-96d2-4e31-8a75-a772df0ec307","Type":"ContainerDied","Data":"3e6759455d5c22f53dd3f5fcd01474eef12bb1b12036be57bd999eb5e298eb46"} Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.644502 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.644530 4894 scope.go:117] "RemoveContainer" containerID="48f3a6bc7cef514282372f1c61d6c1eacc64873ad407e21a5266b39625c132a5" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.672770 4894 scope.go:117] "RemoveContainer" containerID="22b252e98b49ff80c6f897b360a7754173beeda7a0548f9439054421c97cfb9d" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.676376 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.687404 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.705373 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.706952 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.711625 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.712579 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.778547 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:07:50 crc kubenswrapper[4894]: W0613 05:07:50.783128 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb93028dd_00f4_4cb9_821b_fd722a7e1779.slice/crio-6160c1975e531821165944b8b1500a611f850a8cda4eeae892302c49b4f026cd WatchSource:0}: Error finding container 6160c1975e531821165944b8b1500a611f850a8cda4eeae892302c49b4f026cd: Status 404 returned error can't find the container with id 6160c1975e531821165944b8b1500a611f850a8cda4eeae892302c49b4f026cd Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.813785 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.833492 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.833539 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.833558 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np72j\" (UniqueName: \"kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.833775 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.848538 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.916817 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.916862 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.935504 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " 
pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.935542 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.935563 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np72j\" (UniqueName: \"kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.935599 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.936392 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.940811 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.943278 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:50 crc kubenswrapper[4894]: I0613 05:07:50.972374 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np72j\" (UniqueName: \"kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j\") pod \"nova-api-0\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " pod="openstack/nova-api-0" Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.034939 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.509181 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.661558 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerStarted","Data":"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234"} Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.661600 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerStarted","Data":"b3d792499820c278d56a155589931b8329e6bf574551e089a35792140fdd7496"} Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.663607 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b93028dd-00f4-4cb9-821b-fd722a7e1779","Type":"ContainerStarted","Data":"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6"} Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.663698 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b93028dd-00f4-4cb9-821b-fd722a7e1779","Type":"ContainerStarted","Data":"6160c1975e531821165944b8b1500a611f850a8cda4eeae892302c49b4f026cd"} Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.686935 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.686920072 podStartE2EDuration="1.686920072s" podCreationTimestamp="2025-06-13 05:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:51.680018698 +0000 UTC m=+1030.126266171" watchObservedRunningTime="2025-06-13 05:07:51.686920072 +0000 UTC m=+1030.133167535" Jun 13 05:07:51 crc kubenswrapper[4894]: I0613 05:07:51.687947 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jun 13 05:07:52 crc kubenswrapper[4894]: I0613 05:07:52.287862 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7357459b-96d2-4e31-8a75-a772df0ec307" path="/var/lib/kubelet/pods/7357459b-96d2-4e31-8a75-a772df0ec307/volumes" Jun 13 05:07:52 crc kubenswrapper[4894]: I0613 05:07:52.676548 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerStarted","Data":"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56"} Jun 13 05:07:52 crc kubenswrapper[4894]: I0613 05:07:52.697751 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.697731838 podStartE2EDuration="2.697731838s" podCreationTimestamp="2025-06-13 05:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:52.692166331 +0000 UTC m=+1031.138413794" watchObservedRunningTime="2025-06-13 05:07:52.697731838 +0000 UTC m=+1031.143979311" Jun 13 05:07:54 crc kubenswrapper[4894]: I0613 05:07:54.987000 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.359573 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.649239 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8vnlc"] Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.650274 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.651997 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.655016 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.672055 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vnlc"] Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.732557 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.732668 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.733038 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.733078 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8p8t\" (UniqueName: \"kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.835099 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.835166 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8p8t\" (UniqueName: \"kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.835205 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.835233 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.841033 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.843425 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.843483 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.859744 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8p8t\" (UniqueName: \"kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t\") pod \"nova-cell1-cell-mapping-8vnlc\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.917749 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.917798 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jun 13 05:07:55 crc kubenswrapper[4894]: I0613 05:07:55.968515 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.459595 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vnlc"] Jun 13 05:07:56 crc kubenswrapper[4894]: W0613 05:07:56.468859 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c9d623_ef25_45c6_a9c8_f34f64c922cd.slice/crio-b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc WatchSource:0}: Error finding container b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc: Status 404 returned error can't find the container with id b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.718425 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vnlc" event={"ID":"f9c9d623-ef25-45c6-a9c8-f34f64c922cd","Type":"ContainerStarted","Data":"784e390bdb1c8752660f1f76613cd6f9e1b504aa70399033def4b41a7426f163"} Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.718780 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vnlc" event={"ID":"f9c9d623-ef25-45c6-a9c8-f34f64c922cd","Type":"ContainerStarted","Data":"b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc"} Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.738242 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8vnlc" podStartSLOduration=1.7382251119999999 podStartE2EDuration="1.738225112s" podCreationTimestamp="2025-06-13 05:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:07:56.737334746 +0000 UTC m=+1035.183582209" watchObservedRunningTime="2025-06-13 05:07:56.738225112 +0000 UTC m=+1035.184472585" Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.929879 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:07:56 crc kubenswrapper[4894]: I0613 05:07:56.929879 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:00 crc kubenswrapper[4894]: I0613 05:08:00.359232 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jun 13 05:08:00 crc kubenswrapper[4894]: I0613 05:08:00.397209 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jun 13 05:08:00 crc kubenswrapper[4894]: I0613 05:08:00.793599 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.036471 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.036516 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.401827 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-svf7z"] Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.402851 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.406340 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.431067 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.431239 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzk9l\" (UniqueName: \"kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.532557 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzk9l\" (UniqueName: \"kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.532643 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.532777 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.560950 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzk9l\" (UniqueName: \"kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l\") pod \"crc-debug-svf7z\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.719113 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-svf7z" Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.772635 4894 generic.go:334] "Generic (PLEG): container finished" podID="f9c9d623-ef25-45c6-a9c8-f34f64c922cd" containerID="784e390bdb1c8752660f1f76613cd6f9e1b504aa70399033def4b41a7426f163" exitCode=0 Jun 13 05:08:01 crc kubenswrapper[4894]: I0613 05:08:01.772759 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vnlc" event={"ID":"f9c9d623-ef25-45c6-a9c8-f34f64c922cd","Type":"ContainerDied","Data":"784e390bdb1c8752660f1f76613cd6f9e1b504aa70399033def4b41a7426f163"} Jun 13 05:08:02 crc kubenswrapper[4894]: I0613 05:08:02.118860 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:02 crc kubenswrapper[4894]: I0613 05:08:02.118876 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:02 crc kubenswrapper[4894]: I0613 05:08:02.785022 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-svf7z" event={"ID":"f8b99387-e228-48fe-96e5-f6037cc54933","Type":"ContainerStarted","Data":"cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6"} Jun 13 05:08:02 crc kubenswrapper[4894]: I0613 05:08:02.785079 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-svf7z" event={"ID":"f8b99387-e228-48fe-96e5-f6037cc54933","Type":"ContainerStarted","Data":"11686f821d2642af8ea99fa8156680988e05443f02a6dd480ad3083d00bb37e8"} Jun 13 05:08:02 crc kubenswrapper[4894]: I0613 05:08:02.823357 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-svf7z" podStartSLOduration=1.823335498 podStartE2EDuration="1.823335498s" podCreationTimestamp="2025-06-13 05:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:02.816648078 +0000 UTC m=+1041.262895541" watchObservedRunningTime="2025-06-13 05:08:02.823335498 +0000 UTC m=+1041.269582991" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.237551 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.366406 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts\") pod \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.366566 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle\") pod \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.366593 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data\") pod \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.366615 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8p8t\" (UniqueName: \"kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t\") pod \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\" (UID: \"f9c9d623-ef25-45c6-a9c8-f34f64c922cd\") " Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.375809 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t" (OuterVolumeSpecName: "kube-api-access-l8p8t") pod "f9c9d623-ef25-45c6-a9c8-f34f64c922cd" (UID: "f9c9d623-ef25-45c6-a9c8-f34f64c922cd"). InnerVolumeSpecName "kube-api-access-l8p8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.398811 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts" (OuterVolumeSpecName: "scripts") pod "f9c9d623-ef25-45c6-a9c8-f34f64c922cd" (UID: "f9c9d623-ef25-45c6-a9c8-f34f64c922cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.409477 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9c9d623-ef25-45c6-a9c8-f34f64c922cd" (UID: "f9c9d623-ef25-45c6-a9c8-f34f64c922cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.420177 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data" (OuterVolumeSpecName: "config-data") pod "f9c9d623-ef25-45c6-a9c8-f34f64c922cd" (UID: "f9c9d623-ef25-45c6-a9c8-f34f64c922cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.469140 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.469171 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.469181 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.469207 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8p8t\" (UniqueName: \"kubernetes.io/projected/f9c9d623-ef25-45c6-a9c8-f34f64c922cd-kube-api-access-l8p8t\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.797552 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vnlc" event={"ID":"f9c9d623-ef25-45c6-a9c8-f34f64c922cd","Type":"ContainerDied","Data":"b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc"} Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.797603 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b60cc4bc5868b2d1acc2177585799bb48a52a83c80e7d8abe21549a384020cbc" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.797676 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vnlc" Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.964076 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.964358 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-log" containerID="cri-o://3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234" gracePeriod=30 Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.964448 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-api" containerID="cri-o://e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56" gracePeriod=30 Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.976014 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.976260 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-log" containerID="cri-o://2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3" gracePeriod=30 Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.976350 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-metadata" containerID="cri-o://ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a" gracePeriod=30 Jun 13 05:08:03 crc 
kubenswrapper[4894]: I0613 05:08:03.997747 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:03 crc kubenswrapper[4894]: I0613 05:08:03.997930 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerName="nova-scheduler-scheduler" containerID="cri-o://fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" gracePeriod=30 Jun 13 05:08:04 crc kubenswrapper[4894]: I0613 05:08:04.812849 4894 generic.go:334] "Generic (PLEG): container finished" podID="48747a42-da20-4252-afd4-bc1a71b7f786" containerID="3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234" exitCode=143 Jun 13 05:08:04 crc kubenswrapper[4894]: I0613 05:08:04.812944 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerDied","Data":"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234"} Jun 13 05:08:04 crc kubenswrapper[4894]: I0613 05:08:04.815956 4894 generic.go:334] "Generic (PLEG): container finished" podID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerID="2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3" exitCode=143 Jun 13 05:08:04 crc kubenswrapper[4894]: I0613 05:08:04.816039 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerDied","Data":"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3"} Jun 13 05:08:05 crc kubenswrapper[4894]: E0613 05:08:05.362329 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:08:05 crc kubenswrapper[4894]: E0613 05:08:05.363700 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:08:05 crc kubenswrapper[4894]: E0613 05:08:05.365337 4894 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jun 13 05:08:05 crc kubenswrapper[4894]: E0613 05:08:05.365386 4894 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerName="nova-scheduler-scheduler" Jun 13 05:08:06 crc kubenswrapper[4894]: I0613 05:08:06.330116 4894 scope.go:117] "RemoveContainer" containerID="61cc63063a9af5968d10a448fc18e286abc113d8a9af96351c1e811904a0abc6" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.443600 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.540363 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data\") pod \"b93028dd-00f4-4cb9-821b-fd722a7e1779\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.540470 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm6vn\" (UniqueName: \"kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn\") pod \"b93028dd-00f4-4cb9-821b-fd722a7e1779\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.540542 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle\") pod \"b93028dd-00f4-4cb9-821b-fd722a7e1779\" (UID: \"b93028dd-00f4-4cb9-821b-fd722a7e1779\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.565794 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.598595 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b93028dd-00f4-4cb9-821b-fd722a7e1779" (UID: "b93028dd-00f4-4cb9-821b-fd722a7e1779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.599555 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data" (OuterVolumeSpecName: "config-data") pod "b93028dd-00f4-4cb9-821b-fd722a7e1779" (UID: "b93028dd-00f4-4cb9-821b-fd722a7e1779"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.599626 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn" (OuterVolumeSpecName: "kube-api-access-jm6vn") pod "b93028dd-00f4-4cb9-821b-fd722a7e1779" (UID: "b93028dd-00f4-4cb9-821b-fd722a7e1779"). InnerVolumeSpecName "kube-api-access-jm6vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.643563 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle\") pod \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644119 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgflv\" (UniqueName: \"kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv\") pod \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644161 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs\") pod \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644187 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data\") pod \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644243 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs\") pod \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\" (UID: \"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae\") " Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644608 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs" (OuterVolumeSpecName: "logs") pod "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" (UID: "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644801 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644815 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644825 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm6vn\" (UniqueName: \"kubernetes.io/projected/b93028dd-00f4-4cb9-821b-fd722a7e1779-kube-api-access-jm6vn\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.644834 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93028dd-00f4-4cb9-821b-fd722a7e1779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.658182 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv" (OuterVolumeSpecName: "kube-api-access-tgflv") pod "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" (UID: "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae"). InnerVolumeSpecName "kube-api-access-tgflv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.664955 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" (UID: "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.665807 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data" (OuterVolumeSpecName: "config-data") pod "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" (UID: "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.691851 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" (UID: "4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.746645 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.746691 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgflv\" (UniqueName: \"kubernetes.io/projected/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-kube-api-access-tgflv\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.746703 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.746712 4894 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.842352 4894 generic.go:334] "Generic (PLEG): container finished" podID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" exitCode=0 Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.842415 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b93028dd-00f4-4cb9-821b-fd722a7e1779","Type":"ContainerDied","Data":"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6"} Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.842477 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b93028dd-00f4-4cb9-821b-fd722a7e1779","Type":"ContainerDied","Data":"6160c1975e531821165944b8b1500a611f850a8cda4eeae892302c49b4f026cd"} Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.842497 4894 scope.go:117] "RemoveContainer" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.842569 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.844720 4894 generic.go:334] "Generic (PLEG): container finished" podID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerID="ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a" exitCode=0 Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.844835 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerDied","Data":"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a"} Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.844925 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae","Type":"ContainerDied","Data":"ae0846eedebf891117c8abb8837a1b0d5a2d470206b2ed1a8148fd2181cc2a89"} Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.845029 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.864390 4894 scope.go:117] "RemoveContainer" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.864843 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6\": container with ID starting with fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6 not found: ID does not exist" containerID="fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.864962 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6"} err="failed to get container status \"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6\": rpc error: code = NotFound desc = could not find container \"fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6\": container with ID starting with fd86967fc2d7a334ca4f365155af779489afd690e8109c9dbef973087862f6b6 not found: ID does not exist" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.865049 4894 scope.go:117] "RemoveContainer" containerID="ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.884578 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.885555 4894 scope.go:117] "RemoveContainer" containerID="2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.898685 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.907242 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.914267 4894 scope.go:117] "RemoveContainer" containerID="ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.915543 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.915969 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a\": container with ID starting with ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a not found: ID does not exist" containerID="ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.916086 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a"} err="failed to get container status \"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a\": rpc error: code = NotFound desc = could not find container \"ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a\": container with ID starting with ce7bad16433ec42c0b01d3ca6c8aafc38d2e42078f25c68621d1198c20e60e2a not found: ID does not exist" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 
05:08:07.916163 4894 scope.go:117] "RemoveContainer" containerID="2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3" Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.916896 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3\": container with ID starting with 2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3 not found: ID does not exist" containerID="2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.916930 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3"} err="failed to get container status \"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3\": rpc error: code = NotFound desc = could not find container \"2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3\": container with ID starting with 2b5aa0e7d638b4b99bbee0f12600a390865393e479cdfd31f3b2e0dbc78034a3 not found: ID does not exist" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.926381 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.926816 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-metadata" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.926834 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-metadata" Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.926850 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerName="nova-scheduler-scheduler" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.926857 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerName="nova-scheduler-scheduler" Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.926869 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-log" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.926875 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-log" Jun 13 05:08:07 crc kubenswrapper[4894]: E0613 05:08:07.926888 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c9d623-ef25-45c6-a9c8-f34f64c922cd" containerName="nova-manage" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.926895 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c9d623-ef25-45c6-a9c8-f34f64c922cd" containerName="nova-manage" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.927045 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-log" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.927062 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" containerName="nova-scheduler-scheduler" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.927070 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c9d623-ef25-45c6-a9c8-f34f64c922cd" 
containerName="nova-manage" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.927078 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" containerName="nova-metadata-metadata" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.927984 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.931428 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.934706 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.940462 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.941918 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.943521 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.963787 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:07 crc kubenswrapper[4894]: I0613 05:08:07.976131 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053067 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxbb\" (UniqueName: \"kubernetes.io/projected/8c7b253d-3144-4952-b97b-c65d19b4524a-kube-api-access-5zxbb\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053156 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053187 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-config-data\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053234 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053278 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 
05:08:08.053348 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053450 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7b253d-3144-4952-b97b-c65d19b4524a-logs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.053495 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqmn\" (UniqueName: \"kubernetes.io/projected/ff418d80-000a-45be-932e-6b1705b9ab49-kube-api-access-jtqmn\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.155816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.155880 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-config-data\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.155925 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.155993 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.156023 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.156125 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7b253d-3144-4952-b97b-c65d19b4524a-logs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.156187 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqmn\" (UniqueName: \"kubernetes.io/projected/ff418d80-000a-45be-932e-6b1705b9ab49-kube-api-access-jtqmn\") pod \"nova-scheduler-0\" 
(UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.156260 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxbb\" (UniqueName: \"kubernetes.io/projected/8c7b253d-3144-4952-b97b-c65d19b4524a-kube-api-access-5zxbb\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.157193 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7b253d-3144-4952-b97b-c65d19b4524a-logs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.159818 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.159860 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.161501 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-config-data\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.161856 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff418d80-000a-45be-932e-6b1705b9ab49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.162506 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7b253d-3144-4952-b97b-c65d19b4524a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.186747 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zxbb\" (UniqueName: \"kubernetes.io/projected/8c7b253d-3144-4952-b97b-c65d19b4524a-kube-api-access-5zxbb\") pod \"nova-metadata-0\" (UID: \"8c7b253d-3144-4952-b97b-c65d19b4524a\") " pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.187494 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqmn\" (UniqueName: \"kubernetes.io/projected/ff418d80-000a-45be-932e-6b1705b9ab49-kube-api-access-jtqmn\") pod \"nova-scheduler-0\" (UID: \"ff418d80-000a-45be-932e-6b1705b9ab49\") " pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.249293 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.264330 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.293345 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae" path="/var/lib/kubelet/pods/4cf4c61b-e0fe-4814-8c06-a0ee5ba102ae/volumes" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.294005 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93028dd-00f4-4cb9-821b-fd722a7e1779" path="/var/lib/kubelet/pods/b93028dd-00f4-4cb9-821b-fd722a7e1779/volumes" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.759592 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.848267 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.855301 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.855749 4894 generic.go:334] "Generic (PLEG): container finished" podID="48747a42-da20-4252-afd4-bc1a71b7f786" containerID="e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56" exitCode=0 Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.855799 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerDied","Data":"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56"} Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.855822 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48747a42-da20-4252-afd4-bc1a71b7f786","Type":"ContainerDied","Data":"b3d792499820c278d56a155589931b8329e6bf574551e089a35792140fdd7496"} Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.855836 4894 scope.go:117] "RemoveContainer" containerID="e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.857288 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff418d80-000a-45be-932e-6b1705b9ab49","Type":"ContainerStarted","Data":"4302617e392bc1a6f48bc939aacf1fbb3caef3ce31fd0496bfb381170513504b"} Jun 13 05:08:08 crc kubenswrapper[4894]: W0613 05:08:08.860431 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c7b253d_3144_4952_b97b_c65d19b4524a.slice/crio-8213a0c261f4e6bc2773fb589a362f991c0c9b610ab8aa01c503fd9f2e76ddf0 WatchSource:0}: Error finding container 8213a0c261f4e6bc2773fb589a362f991c0c9b610ab8aa01c503fd9f2e76ddf0: Status 404 returned error can't find the container with id 8213a0c261f4e6bc2773fb589a362f991c0c9b610ab8aa01c503fd9f2e76ddf0 Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.894577 4894 scope.go:117] "RemoveContainer" containerID="3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.914019 4894 scope.go:117] "RemoveContainer" containerID="e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56" Jun 13 05:08:08 crc kubenswrapper[4894]: E0613 05:08:08.914668 4894 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56\": container with ID starting with e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56 not found: ID does not exist" containerID="e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.914696 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56"} err="failed to get container status \"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56\": rpc error: code = NotFound desc = could not find container \"e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56\": container with ID starting with e04bc9e93822dfd5f8c59f32e402de408555eb69776a112e1666eaee4acf9d56 not found: ID does not exist" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.914719 4894 scope.go:117] "RemoveContainer" containerID="3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234" Jun 13 05:08:08 crc kubenswrapper[4894]: E0613 05:08:08.915066 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234\": container with ID starting with 3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234 not found: ID does not exist" containerID="3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.915084 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234"} err="failed to get container status \"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234\": rpc error: code = NotFound desc = could not find container \"3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234\": container with ID starting with 3cb0e77ab53859639fdb9186459d805f453eb121041d2dcb3fa25a2df1a78234 not found: ID does not exist" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.970637 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs\") pod \"48747a42-da20-4252-afd4-bc1a71b7f786\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.970966 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data\") pod \"48747a42-da20-4252-afd4-bc1a71b7f786\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.971050 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle\") pod \"48747a42-da20-4252-afd4-bc1a71b7f786\" (UID: \"48747a42-da20-4252-afd4-bc1a71b7f786\") " Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.971156 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np72j\" (UniqueName: \"kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j\") pod \"48747a42-da20-4252-afd4-bc1a71b7f786\" (UID: 
\"48747a42-da20-4252-afd4-bc1a71b7f786\") " Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.971367 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs" (OuterVolumeSpecName: "logs") pod "48747a42-da20-4252-afd4-bc1a71b7f786" (UID: "48747a42-da20-4252-afd4-bc1a71b7f786"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.971784 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48747a42-da20-4252-afd4-bc1a71b7f786-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:08 crc kubenswrapper[4894]: I0613 05:08:08.975247 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j" (OuterVolumeSpecName: "kube-api-access-np72j") pod "48747a42-da20-4252-afd4-bc1a71b7f786" (UID: "48747a42-da20-4252-afd4-bc1a71b7f786"). InnerVolumeSpecName "kube-api-access-np72j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.001544 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48747a42-da20-4252-afd4-bc1a71b7f786" (UID: "48747a42-da20-4252-afd4-bc1a71b7f786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.004886 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data" (OuterVolumeSpecName: "config-data") pod "48747a42-da20-4252-afd4-bc1a71b7f786" (UID: "48747a42-da20-4252-afd4-bc1a71b7f786"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.073074 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.073107 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48747a42-da20-4252-afd4-bc1a71b7f786-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.073121 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np72j\" (UniqueName: \"kubernetes.io/projected/48747a42-da20-4252-afd4-bc1a71b7f786-kube-api-access-np72j\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.872040 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7b253d-3144-4952-b97b-c65d19b4524a","Type":"ContainerStarted","Data":"47eae584e2b4ac6c69f3579a6a7cadf2c96a46a4eb6bff1d0a191938971eaa0c"} Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.872113 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7b253d-3144-4952-b97b-c65d19b4524a","Type":"ContainerStarted","Data":"a3c29382aefef3433436d1784ea29be2df14f53a7a2250fabee826ab3a8128a6"} Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.872134 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7b253d-3144-4952-b97b-c65d19b4524a","Type":"ContainerStarted","Data":"8213a0c261f4e6bc2773fb589a362f991c0c9b610ab8aa01c503fd9f2e76ddf0"} Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.873643 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.875824 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff418d80-000a-45be-932e-6b1705b9ab49","Type":"ContainerStarted","Data":"d97b72d53760eb39836aeb6dd46956031da5c4efa8adc577395fdcbced893c8b"} Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.918175 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.918150602 podStartE2EDuration="2.918150602s" podCreationTimestamp="2025-06-13 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:09.912125251 +0000 UTC m=+1048.358372714" watchObservedRunningTime="2025-06-13 05:08:09.918150602 +0000 UTC m=+1048.364398095" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.943156 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.943140342 podStartE2EDuration="2.943140342s" podCreationTimestamp="2025-06-13 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:09.936645037 +0000 UTC m=+1048.382892530" watchObservedRunningTime="2025-06-13 05:08:09.943140342 +0000 UTC m=+1048.389387795" Jun 13 05:08:09 crc kubenswrapper[4894]: I0613 05:08:09.985389 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.007218 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.028514 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:10 crc kubenswrapper[4894]: E0613 05:08:10.029316 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-api" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.029472 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-api" Jun 13 05:08:10 crc kubenswrapper[4894]: E0613 05:08:10.029587 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-log" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.029691 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-log" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.030022 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-log" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.030143 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" containerName="nova-api-api" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.031459 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.036814 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.045526 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.097158 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.097226 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.097422 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxkh\" (UniqueName: \"kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.097627 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.199780 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.199875 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxkh\" (UniqueName: \"kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.199940 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.200014 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.200223 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " 
pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.204872 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.213182 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.220918 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxkh\" (UniqueName: \"kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh\") pod \"nova-api-0\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.287805 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48747a42-da20-4252-afd4-bc1a71b7f786" path="/var/lib/kubelet/pods/48747a42-da20-4252-afd4-bc1a71b7f786/volumes" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.345030 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.613066 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:10 crc kubenswrapper[4894]: W0613 05:08:10.616996 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c1bd681_c24e_4c62_bcc0_89ff1cbb3e26.slice/crio-5572a05c90ab584777c6716b615c39a3421bbd292890fbb4f13e3cfb08657fb1 WatchSource:0}: Error finding container 5572a05c90ab584777c6716b615c39a3421bbd292890fbb4f13e3cfb08657fb1: Status 404 returned error can't find the container with id 5572a05c90ab584777c6716b615c39a3421bbd292890fbb4f13e3cfb08657fb1 Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.887506 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerStarted","Data":"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10"} Jun 13 05:08:10 crc kubenswrapper[4894]: I0613 05:08:10.887867 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerStarted","Data":"5572a05c90ab584777c6716b615c39a3421bbd292890fbb4f13e3cfb08657fb1"} Jun 13 05:08:11 crc kubenswrapper[4894]: I0613 05:08:11.904064 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerStarted","Data":"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863"} Jun 13 05:08:11 crc kubenswrapper[4894]: I0613 05:08:11.948141 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.948112042 podStartE2EDuration="2.948112042s" podCreationTimestamp="2025-06-13 05:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:11.934519166 +0000 UTC m=+1050.380766669" 
watchObservedRunningTime="2025-06-13 05:08:11.948112042 +0000 UTC m=+1050.394359535" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.244791 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-svf7z"] Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.245061 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-svf7z" podUID="f8b99387-e228-48fe-96e5-f6037cc54933" containerName="container-00" containerID="cri-o://cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6" gracePeriod=2 Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.259891 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-svf7z"] Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.355564 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-svf7z" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.448730 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host\") pod \"f8b99387-e228-48fe-96e5-f6037cc54933\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.448856 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host" (OuterVolumeSpecName: "host") pod "f8b99387-e228-48fe-96e5-f6037cc54933" (UID: "f8b99387-e228-48fe-96e5-f6037cc54933"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.448905 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzk9l\" (UniqueName: \"kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l\") pod \"f8b99387-e228-48fe-96e5-f6037cc54933\" (UID: \"f8b99387-e228-48fe-96e5-f6037cc54933\") " Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.449728 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8b99387-e228-48fe-96e5-f6037cc54933-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.455859 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l" (OuterVolumeSpecName: "kube-api-access-tzk9l") pod "f8b99387-e228-48fe-96e5-f6037cc54933" (UID: "f8b99387-e228-48fe-96e5-f6037cc54933"). InnerVolumeSpecName "kube-api-access-tzk9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.551783 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzk9l\" (UniqueName: \"kubernetes.io/projected/f8b99387-e228-48fe-96e5-f6037cc54933-kube-api-access-tzk9l\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.917817 4894 generic.go:334] "Generic (PLEG): container finished" podID="f8b99387-e228-48fe-96e5-f6037cc54933" containerID="cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6" exitCode=0 Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.917975 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-svf7z" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.918005 4894 scope.go:117] "RemoveContainer" containerID="cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.958122 4894 scope.go:117] "RemoveContainer" containerID="cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6" Jun 13 05:08:12 crc kubenswrapper[4894]: E0613 05:08:12.958481 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6\": container with ID starting with cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6 not found: ID does not exist" containerID="cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6" Jun 13 05:08:12 crc kubenswrapper[4894]: I0613 05:08:12.958557 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6"} err="failed to get container status \"cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6\": rpc error: code = NotFound desc = could not find container \"cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6\": container with ID starting with cdc83ba4be30da9bf67455e693f78e5780ae7401664c827fc17532ad6f05eee6 not found: ID does not exist" Jun 13 05:08:13 crc kubenswrapper[4894]: I0613 05:08:13.250817 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jun 13 05:08:13 crc kubenswrapper[4894]: I0613 05:08:13.251157 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jun 13 05:08:13 crc kubenswrapper[4894]: I0613 05:08:13.264791 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jun 13 05:08:14 crc kubenswrapper[4894]: I0613 05:08:14.290827 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b99387-e228-48fe-96e5-f6037cc54933" path="/var/lib/kubelet/pods/f8b99387-e228-48fe-96e5-f6037cc54933/volumes" Jun 13 05:08:18 crc kubenswrapper[4894]: I0613 05:08:18.250737 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jun 13 05:08:18 crc kubenswrapper[4894]: I0613 05:08:18.252833 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jun 13 05:08:18 crc kubenswrapper[4894]: I0613 05:08:18.265539 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jun 13 05:08:18 crc kubenswrapper[4894]: I0613 05:08:18.300107 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jun 13 05:08:19 crc kubenswrapper[4894]: I0613 05:08:19.013402 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jun 13 05:08:19 crc kubenswrapper[4894]: I0613 05:08:19.270928 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7b253d-3144-4952-b97b-c65d19b4524a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:19 crc kubenswrapper[4894]: I0613 05:08:19.271627 4894 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7b253d-3144-4952-b97b-c65d19b4524a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:20 crc kubenswrapper[4894]: I0613 05:08:20.346835 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:08:20 crc kubenswrapper[4894]: I0613 05:08:20.347892 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:08:21 crc kubenswrapper[4894]: I0613 05:08:21.428935 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:21 crc kubenswrapper[4894]: I0613 05:08:21.429105 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:26 crc kubenswrapper[4894]: I0613 05:08:26.236983 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:08:26 crc kubenswrapper[4894]: I0613 05:08:26.237565 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:08:28 crc kubenswrapper[4894]: I0613 05:08:28.259434 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jun 13 05:08:28 crc kubenswrapper[4894]: I0613 05:08:28.260431 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jun 13 05:08:28 crc kubenswrapper[4894]: I0613 05:08:28.309594 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jun 13 05:08:29 crc kubenswrapper[4894]: I0613 05:08:29.130634 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jun 13 05:08:30 crc kubenswrapper[4894]: I0613 05:08:30.354313 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jun 13 05:08:30 crc kubenswrapper[4894]: I0613 05:08:30.356116 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jun 13 05:08:30 crc kubenswrapper[4894]: I0613 05:08:30.356759 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jun 13 05:08:30 crc kubenswrapper[4894]: I0613 05:08:30.366266 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.138315 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.142802 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.395395 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:08:31 crc kubenswrapper[4894]: E0613 05:08:31.396718 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b99387-e228-48fe-96e5-f6037cc54933" containerName="container-00" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.396799 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b99387-e228-48fe-96e5-f6037cc54933" containerName="container-00" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.397029 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b99387-e228-48fe-96e5-f6037cc54933" containerName="container-00" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.397972 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.411955 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.430639 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.431017 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.431186 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.431264 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.431371 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lkt\" (UniqueName: \"kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.533112 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lkt\" (UniqueName: 
\"kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.533318 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.533417 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.533497 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.533572 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.534377 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.534645 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.534969 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.535489 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.559853 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lkt\" (UniqueName: \"kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt\") pod \"dnsmasq-dns-746fb47d4f-4w269\" (UID: 
\"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:31 crc kubenswrapper[4894]: I0613 05:08:31.738874 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.015092 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.145630 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" event={"ID":"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311","Type":"ContainerStarted","Data":"7d7cf133e4a23db2071bd4a967c6eef1922c96114c72238f9532b687bbab7123"} Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.564788 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.565373 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-central-agent" containerID="cri-o://4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea" gracePeriod=30 Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.565483 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="proxy-httpd" containerID="cri-o://a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a" gracePeriod=30 Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.565518 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="sg-core" containerID="cri-o://8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56" gracePeriod=30 Jun 13 05:08:32 crc kubenswrapper[4894]: I0613 05:08:32.565550 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-notification-agent" containerID="cri-o://372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093" gracePeriod=30 Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.156846 4894 generic.go:334] "Generic (PLEG): container finished" podID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerID="97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f" exitCode=0 Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.157169 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" event={"ID":"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311","Type":"ContainerDied","Data":"97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f"} Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.168778 4894 generic.go:334] "Generic (PLEG): container finished" podID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerID="a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a" exitCode=0 Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.168805 4894 generic.go:334] "Generic (PLEG): container finished" podID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerID="8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56" exitCode=2 Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.169520 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerDied","Data":"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a"} Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.169551 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerDied","Data":"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56"} Jun 13 05:08:33 crc kubenswrapper[4894]: I0613 05:08:33.972996 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.180517 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" event={"ID":"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311","Type":"ContainerStarted","Data":"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691"} Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.180670 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.184795 4894 generic.go:334] "Generic (PLEG): container finished" podID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerID="4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea" exitCode=0 Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.184891 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerDied","Data":"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea"} Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.185092 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-log" containerID="cri-o://cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10" gracePeriod=30 Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.185224 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-api" containerID="cri-o://c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863" gracePeriod=30 Jun 13 05:08:34 crc kubenswrapper[4894]: I0613 05:08:34.198073 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" podStartSLOduration=3.198060102 podStartE2EDuration="3.198060102s" podCreationTimestamp="2025-06-13 05:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:34.196198779 +0000 UTC m=+1072.642446242" watchObservedRunningTime="2025-06-13 05:08:34.198060102 +0000 UTC m=+1072.644307565" Jun 13 05:08:35 crc kubenswrapper[4894]: I0613 05:08:35.196686 4894 generic.go:334] "Generic (PLEG): container finished" podID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerID="cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10" exitCode=143 Jun 13 05:08:35 crc kubenswrapper[4894]: I0613 05:08:35.196795 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerDied","Data":"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10"} Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.092514 4894 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135567 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135640 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135682 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135736 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dddm\" (UniqueName: \"kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135841 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135875 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135919 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.135947 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts\") pod \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\" (UID: \"55e0b9d8-63c6-48d1-88e3-2402eb6002a2\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.139179 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.139322 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.142786 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts" (OuterVolumeSpecName: "scripts") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.148407 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm" (OuterVolumeSpecName: "kube-api-access-8dddm") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "kube-api-access-8dddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.199176 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237788 4894 generic.go:334] "Generic (PLEG): container finished" podID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerID="372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093" exitCode=0 Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237825 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerDied","Data":"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093"} Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237853 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55e0b9d8-63c6-48d1-88e3-2402eb6002a2","Type":"ContainerDied","Data":"3e0e645ce26a3dfac3cb8c5fba8e5bc15dfe8da946fcbe19c63caa0da8226d91"} Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237870 4894 scope.go:117] "RemoveContainer" containerID="a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237937 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237961 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-run-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237970 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237981 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.237992 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dddm\" (UniqueName: \"kubernetes.io/projected/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-kube-api-access-8dddm\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.238001 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.249891 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.274036 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.277931 4894 scope.go:117] "RemoveContainer" containerID="8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.295252 4894 scope.go:117] "RemoveContainer" containerID="372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.320369 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data" (OuterVolumeSpecName: "config-data") pod "55e0b9d8-63c6-48d1-88e3-2402eb6002a2" (UID: "55e0b9d8-63c6-48d1-88e3-2402eb6002a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.321479 4894 scope.go:117] "RemoveContainer" containerID="4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.340470 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.340495 4894 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.340506 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e0b9d8-63c6-48d1-88e3-2402eb6002a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.344561 4894 scope.go:117] "RemoveContainer" containerID="a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.344961 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a\": container with ID starting with a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a not found: ID does not exist" containerID="a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.344987 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a"} err="failed to get container status \"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a\": rpc error: code = NotFound desc = could not find container \"a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a\": container with ID starting with a27520e44f9f0c773e2d39561af88967136a7e8acb3ac8eef3bea1df2d35a38a not found: ID does not exist" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.345006 4894 scope.go:117] "RemoveContainer" containerID="8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.345329 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56\": container with ID starting with 8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56 not found: ID does not exist" containerID="8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.345364 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56"} err="failed to get container status \"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56\": rpc error: code = NotFound desc = could not find container \"8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56\": container with ID starting with 8524e943c11968616669ee3d60e6387d7b54f8a27aa7ac01eb65e6229cc8dd56 not found: ID does not exist" Jun 13 05:08:37 crc 
kubenswrapper[4894]: I0613 05:08:37.345388 4894 scope.go:117] "RemoveContainer" containerID="372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.345751 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093\": container with ID starting with 372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093 not found: ID does not exist" containerID="372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.345776 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093"} err="failed to get container status \"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093\": rpc error: code = NotFound desc = could not find container \"372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093\": container with ID starting with 372b886a8516cdb609ed02f48dfd762560c9fd3822cb3ff6700f7a2728e42093 not found: ID does not exist" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.345790 4894 scope.go:117] "RemoveContainer" containerID="4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.346106 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea\": container with ID starting with 4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea not found: ID does not exist" containerID="4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.346162 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea"} err="failed to get container status \"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea\": rpc error: code = NotFound desc = could not find container \"4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea\": container with ID starting with 4d27e210b3c396769944bb6baeb7ee2797fdc263b6962b4f463e82eab49f85ea not found: ID does not exist" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.628459 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.656701 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.667473 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.667941 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-central-agent" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.667963 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-central-agent" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.667991 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-notification-agent" Jun 13 05:08:37 crc 
kubenswrapper[4894]: I0613 05:08:37.667998 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-notification-agent" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.668018 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="sg-core" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668024 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="sg-core" Jun 13 05:08:37 crc kubenswrapper[4894]: E0613 05:08:37.668039 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="proxy-httpd" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668045 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="proxy-httpd" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668256 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-notification-agent" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668265 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="proxy-httpd" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668298 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="ceilometer-central-agent" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.668319 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" containerName="sg-core" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.669998 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.671926 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.673412 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.673585 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.683249 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.701400 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753198 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle\") pod \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753326 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs\") pod \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753345 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data\") pod \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753421 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxkh\" (UniqueName: \"kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh\") pod \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\" (UID: \"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26\") " Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753729 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753768 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753828 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk7m9\" (UniqueName: \"kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753864 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.753892 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.754082 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.754099 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs" (OuterVolumeSpecName: "logs") pod "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" (UID: "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.754297 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.754424 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.754595 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.758791 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh" (OuterVolumeSpecName: "kube-api-access-pbxkh") pod "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" (UID: "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26"). InnerVolumeSpecName "kube-api-access-pbxkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.794313 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" (UID: "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.794767 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data" (OuterVolumeSpecName: "config-data") pod "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" (UID: "4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.855917 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.855974 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856035 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk7m9\" (UniqueName: \"kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856071 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856103 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856123 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856162 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856185 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856230 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856242 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxkh\" (UniqueName: \"kubernetes.io/projected/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-kube-api-access-pbxkh\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.856253 4894 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.858270 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.859593 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.859916 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.861162 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.862892 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.863212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.863309 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.879307 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk7m9\" (UniqueName: \"kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9\") pod \"ceilometer-0\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " pod="openstack/ceilometer-0" Jun 13 05:08:37 crc kubenswrapper[4894]: I0613 05:08:37.985801 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.263876 4894 generic.go:334] "Generic (PLEG): container finished" podID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerID="c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863" exitCode=0 Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.263904 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerDied","Data":"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863"} Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.263923 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26","Type":"ContainerDied","Data":"5572a05c90ab584777c6716b615c39a3421bbd292890fbb4f13e3cfb08657fb1"} Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.263941 4894 scope.go:117] "RemoveContainer" containerID="c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.264032 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.304690 4894 scope.go:117] "RemoveContainer" containerID="cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.308469 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e0b9d8-63c6-48d1-88e3-2402eb6002a2" path="/var/lib/kubelet/pods/55e0b9d8-63c6-48d1-88e3-2402eb6002a2/volumes" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.315524 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.324324 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.334497 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:38 crc kubenswrapper[4894]: E0613 05:08:38.334903 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-log" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.334921 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-log" Jun 13 05:08:38 crc kubenswrapper[4894]: E0613 05:08:38.334937 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-api" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.334943 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-api" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.336439 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-api" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.336466 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" containerName="nova-api-log" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.337406 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.339159 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.339324 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.342797 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.346367 4894 scope.go:117] "RemoveContainer" containerID="c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.353982 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:38 crc kubenswrapper[4894]: E0613 05:08:38.355133 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863\": container with ID starting with c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863 not found: ID does not exist" containerID="c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.355171 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863"} err="failed to get container status \"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863\": rpc error: code = NotFound desc = could not find container \"c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863\": container with ID starting with c62ca1ebd872488e2f57115fcd3c8d031195b2e9f8120bc1ed3c6ee1e6db9863 not found: ID does not exist" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.355197 4894 scope.go:117] "RemoveContainer" containerID="cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10" Jun 13 05:08:38 crc kubenswrapper[4894]: E0613 05:08:38.356582 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10\": container with ID starting with cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10 not found: ID does not exist" containerID="cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.356626 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10"} err="failed to get container status \"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10\": rpc error: code = NotFound desc = could not find container \"cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10\": container with ID starting with cd7f714250594d94a50ced33b74a56cb279a5dff3c0138ebcfc437adc8d6fd10 not found: ID does not exist" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.363846 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxrj\" (UniqueName: \"kubernetes.io/projected/e7c817a0-08df-456b-a791-53dd7ad019b6-kube-api-access-zmxrj\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" 
Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.363959 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.364104 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.364194 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.364279 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c817a0-08df-456b-a791-53dd7ad019b6-logs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.364398 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-config-data\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466068 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466116 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466141 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c817a0-08df-456b-a791-53dd7ad019b6-logs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466213 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-config-data\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466259 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxrj\" (UniqueName: \"kubernetes.io/projected/e7c817a0-08df-456b-a791-53dd7ad019b6-kube-api-access-zmxrj\") pod \"nova-api-0\" (UID: 
\"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466279 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.466846 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7c817a0-08df-456b-a791-53dd7ad019b6-logs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.471238 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.472516 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.472693 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-config-data\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.476400 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c817a0-08df-456b-a791-53dd7ad019b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.483383 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.489715 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.491074 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxrj\" (UniqueName: \"kubernetes.io/projected/e7c817a0-08df-456b-a791-53dd7ad019b6-kube-api-access-zmxrj\") pod \"nova-api-0\" (UID: \"e7c817a0-08df-456b-a791-53dd7ad019b6\") " pod="openstack/nova-api-0" Jun 13 05:08:38 crc kubenswrapper[4894]: I0613 05:08:38.656362 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jun 13 05:08:39 crc kubenswrapper[4894]: W0613 05:08:39.116233 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c817a0_08df_456b_a791_53dd7ad019b6.slice/crio-48030a6b3145e575895568e7e5daed5842c635c8ba74b2efd9cb1fb23dae4b81 WatchSource:0}: Error finding container 48030a6b3145e575895568e7e5daed5842c635c8ba74b2efd9cb1fb23dae4b81: Status 404 returned error can't find the container with id 48030a6b3145e575895568e7e5daed5842c635c8ba74b2efd9cb1fb23dae4b81 Jun 13 05:08:39 crc kubenswrapper[4894]: I0613 05:08:39.119562 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jun 13 05:08:39 crc kubenswrapper[4894]: I0613 05:08:39.278249 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerStarted","Data":"077889bbf97da40d8df77f017b7649eb02434343a5e2598c53a9bdcaeb6b31ed"} Jun 13 05:08:39 crc kubenswrapper[4894]: I0613 05:08:39.280224 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7c817a0-08df-456b-a791-53dd7ad019b6","Type":"ContainerStarted","Data":"48030a6b3145e575895568e7e5daed5842c635c8ba74b2efd9cb1fb23dae4b81"} Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.288128 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26" path="/var/lib/kubelet/pods/4c1bd681-c24e-4c62-bcc0-89ff1cbb3e26/volumes" Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.289667 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7c817a0-08df-456b-a791-53dd7ad019b6","Type":"ContainerStarted","Data":"a1fe69091544ccd4ded4494e324774c33e2ed42e3e6f3897fb4a366c761e5703"} Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.289693 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7c817a0-08df-456b-a791-53dd7ad019b6","Type":"ContainerStarted","Data":"813b0b477568d08825e5206ea8d1e767653480eaf60e68f9f26138add6afa2e3"} Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.290846 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerStarted","Data":"eb6be4eb3829fb0983ce959415ddcf6ed1bc0b48a543c359cc7a054cd85f8529"} Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.290892 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerStarted","Data":"3f6af05f49cdc90780781567aa8c5da8fa435aedfd3e02248ebb3daeac2c7aa9"} Jun 13 05:08:40 crc kubenswrapper[4894]: I0613 05:08:40.321278 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.321255777 podStartE2EDuration="2.321255777s" podCreationTimestamp="2025-06-13 05:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:08:40.307766054 +0000 UTC m=+1078.754013517" watchObservedRunningTime="2025-06-13 05:08:40.321255777 +0000 UTC m=+1078.767503240" Jun 13 05:08:41 crc kubenswrapper[4894]: I0613 05:08:41.305723 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerStarted","Data":"8357472a0e3cd45a72f60a4d0236f956f454fc4df10370e57d1e3a455687a26f"} Jun 13 05:08:41 crc kubenswrapper[4894]: I0613 05:08:41.740420 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:08:41 crc kubenswrapper[4894]: I0613 05:08:41.823669 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:08:41 crc kubenswrapper[4894]: I0613 05:08:41.823968 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="dnsmasq-dns" containerID="cri-o://e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314" gracePeriod=10 Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.378864 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.380934 4894 generic.go:334] "Generic (PLEG): container finished" podID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerID="e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314" exitCode=0 Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.380992 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" event={"ID":"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff","Type":"ContainerDied","Data":"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314"} Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.381017 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" event={"ID":"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff","Type":"ContainerDied","Data":"443b8f0201ef95004c770e52f6dd3fadcef7d44092eb8700d7e533eaf6d4f043"} Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.381032 4894 scope.go:117] "RemoveContainer" containerID="e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.404955 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerStarted","Data":"fd67d6f163b7585f8eb1f335db268f413b8c4e3d89ae1fbb9865187a0f5715b0"} Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.405963 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.415461 4894 scope.go:117] "RemoveContainer" containerID="78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.455012 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.455164 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.455226 4894 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-g6bh8\" (UniqueName: \"kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.455253 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.455318 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.475675 4894 scope.go:117] "RemoveContainer" containerID="e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314" Jun 13 05:08:42 crc kubenswrapper[4894]: E0613 05:08:42.476626 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314\": container with ID starting with e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314 not found: ID does not exist" containerID="e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.476666 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314"} err="failed to get container status \"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314\": rpc error: code = NotFound desc = could not find container \"e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314\": container with ID starting with e8589708ef53c91d19c30b32d355a411f6361f858dba1d92aab1abaa6d750314 not found: ID does not exist" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.476686 4894 scope.go:117] "RemoveContainer" containerID="78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f" Jun 13 05:08:42 crc kubenswrapper[4894]: E0613 05:08:42.478815 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f\": container with ID starting with 78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f not found: ID does not exist" containerID="78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.478835 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f"} err="failed to get container status \"78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f\": rpc error: code = NotFound desc = could not find container \"78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f\": container with ID starting with 78a6e6ff65325f46b0a9a2a2a0d446893a0746dfbd7e6961c32644b0680c127f not found: ID does not exist" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.483419 4894 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.046118063 podStartE2EDuration="5.48341024s" podCreationTimestamp="2025-06-13 05:08:37 +0000 UTC" firstStartedPulling="2025-06-13 05:08:38.489479463 +0000 UTC m=+1076.935726926" lastFinishedPulling="2025-06-13 05:08:41.92677164 +0000 UTC m=+1080.373019103" observedRunningTime="2025-06-13 05:08:42.470614537 +0000 UTC m=+1080.916862000" watchObservedRunningTime="2025-06-13 05:08:42.48341024 +0000 UTC m=+1080.929657703" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.512836 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8" (OuterVolumeSpecName: "kube-api-access-g6bh8") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff"). InnerVolumeSpecName "kube-api-access-g6bh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.570420 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6bh8\" (UniqueName: \"kubernetes.io/projected/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-kube-api-access-g6bh8\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.574281 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.593558 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config" (OuterVolumeSpecName: "config") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:08:42 crc kubenswrapper[4894]: E0613 05:08:42.638946 4894 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc podName:f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff nodeName:}" failed. No retries permitted until 2025-06-13 05:08:43.138921024 +0000 UTC m=+1081.585168477 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff") : error deleting /var/lib/kubelet/pods/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff/volume-subpaths: remove /var/lib/kubelet/pods/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff/volume-subpaths: no such file or directory Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.639199 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.671950 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.671980 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:42 crc kubenswrapper[4894]: I0613 05:08:42.671992 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.181326 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") pod \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\" (UID: \"f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff\") " Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.182267 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" (UID: "f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.284422 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.422311 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59dc98b7b9-ns5ql" Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.471524 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:08:43 crc kubenswrapper[4894]: I0613 05:08:43.486728 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59dc98b7b9-ns5ql"] Jun 13 05:08:44 crc kubenswrapper[4894]: I0613 05:08:44.295158 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" path="/var/lib/kubelet/pods/f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff/volumes" Jun 13 05:08:48 crc kubenswrapper[4894]: I0613 05:08:48.656588 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:08:48 crc kubenswrapper[4894]: I0613 05:08:48.657224 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jun 13 05:08:49 crc kubenswrapper[4894]: I0613 05:08:49.672817 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e7c817a0-08df-456b-a791-53dd7ad019b6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:49 crc kubenswrapper[4894]: I0613 05:08:49.672866 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e7c817a0-08df-456b-a791-53dd7ad019b6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jun 13 05:08:56 crc kubenswrapper[4894]: I0613 05:08:56.236125 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:08:56 crc kubenswrapper[4894]: I0613 05:08:56.236793 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:08:58 crc kubenswrapper[4894]: I0613 05:08:58.684458 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jun 13 05:08:58 crc kubenswrapper[4894]: I0613 05:08:58.685762 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jun 13 05:08:58 crc kubenswrapper[4894]: I0613 05:08:58.690145 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jun 13 05:08:58 crc kubenswrapper[4894]: I0613 05:08:58.696485 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jun 13 05:08:59 crc kubenswrapper[4894]: I0613 05:08:59.593627 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jun 13 05:08:59 crc kubenswrapper[4894]: I0613 05:08:59.606580 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.574728 4894 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/crc-debug-67vwh"] Jun 13 05:09:01 crc kubenswrapper[4894]: E0613 05:09:01.575300 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="dnsmasq-dns" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.575313 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="dnsmasq-dns" Jun 13 05:09:01 crc kubenswrapper[4894]: E0613 05:09:01.575341 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="init" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.575347 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="init" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.575497 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9da35fe-2b5c-4ec2-8241-e7fd4ba260ff" containerName="dnsmasq-dns" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.576113 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.579193 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.657208 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.657286 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr59\" (UniqueName: \"kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.759209 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smr59\" (UniqueName: \"kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.759566 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.759791 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc kubenswrapper[4894]: I0613 05:09:01.791696 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr59\" (UniqueName: \"kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59\") pod \"crc-debug-67vwh\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " pod="openstack/crc-debug-67vwh" Jun 13 05:09:01 crc 
kubenswrapper[4894]: I0613 05:09:01.901895 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-67vwh" Jun 13 05:09:02 crc kubenswrapper[4894]: I0613 05:09:02.631967 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-67vwh" event={"ID":"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18","Type":"ContainerStarted","Data":"4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b"} Jun 13 05:09:02 crc kubenswrapper[4894]: I0613 05:09:02.632045 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-67vwh" event={"ID":"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18","Type":"ContainerStarted","Data":"7862ae860c563cacd89a93e672bb64137c356ac5eda1a238a42b76ade4d4efd8"} Jun 13 05:09:02 crc kubenswrapper[4894]: I0613 05:09:02.660099 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-67vwh" podStartSLOduration=1.660079572 podStartE2EDuration="1.660079572s" podCreationTimestamp="2025-06-13 05:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:09:02.658230109 +0000 UTC m=+1101.104477582" watchObservedRunningTime="2025-06-13 05:09:02.660079572 +0000 UTC m=+1101.106327045" Jun 13 05:09:06 crc kubenswrapper[4894]: I0613 05:09:06.597866 4894 scope.go:117] "RemoveContainer" containerID="a906cc013dd326881c943e6d5c8c8e55d9bc1073feb9d43c78c73a318f1f8f59" Jun 13 05:09:06 crc kubenswrapper[4894]: I0613 05:09:06.632418 4894 scope.go:117] "RemoveContainer" containerID="9b479e89ad21b47d6f3fa998d22c01556217fb87597ee4a3ee327a1446f0180c" Jun 13 05:09:06 crc kubenswrapper[4894]: I0613 05:09:06.658576 4894 scope.go:117] "RemoveContainer" containerID="1ae3dbbba472a587b388d616e639682714571fc2ef33644a6d169b67e096cdc4" Jun 13 05:09:08 crc kubenswrapper[4894]: I0613 05:09:08.002519 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.590800 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-67vwh"] Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.591678 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-67vwh" podUID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" containerName="container-00" containerID="cri-o://4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b" gracePeriod=2 Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.598325 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-67vwh"] Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.688550 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-67vwh" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.757297 4894 generic.go:334] "Generic (PLEG): container finished" podID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" containerID="4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b" exitCode=0 Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.757403 4894 scope.go:117] "RemoveContainer" containerID="4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.757599 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-67vwh" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.793319 4894 scope.go:117] "RemoveContainer" containerID="4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b" Jun 13 05:09:12 crc kubenswrapper[4894]: E0613 05:09:12.794199 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b\": container with ID starting with 4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b not found: ID does not exist" containerID="4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.794266 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b"} err="failed to get container status \"4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b\": rpc error: code = NotFound desc = could not find container \"4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b\": container with ID starting with 4ba7df6eb44b972013989960a13b70a2c3a56a588fd9368fe95f80d0e311588b not found: ID does not exist" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.794924 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smr59\" (UniqueName: \"kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59\") pod \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.795256 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host\") pod \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\" (UID: \"6cf01f5e-66fb-4df8-979a-9f0b81c5ad18\") " Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.795323 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host" (OuterVolumeSpecName: "host") pod "6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" (UID: "6cf01f5e-66fb-4df8-979a-9f0b81c5ad18"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.795913 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.807394 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59" (OuterVolumeSpecName: "kube-api-access-smr59") pod "6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" (UID: "6cf01f5e-66fb-4df8-979a-9f0b81c5ad18"). InnerVolumeSpecName "kube-api-access-smr59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:12 crc kubenswrapper[4894]: I0613 05:09:12.898606 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smr59\" (UniqueName: \"kubernetes.io/projected/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18-kube-api-access-smr59\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:14 crc kubenswrapper[4894]: I0613 05:09:14.297936 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" path="/var/lib/kubelet/pods/6cf01f5e-66fb-4df8-979a-9f0b81c5ad18/volumes" Jun 13 05:09:16 crc kubenswrapper[4894]: I0613 05:09:16.927324 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:17 crc kubenswrapper[4894]: I0613 05:09:17.792927 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:21 crc kubenswrapper[4894]: I0613 05:09:21.395297 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="rabbitmq" containerID="cri-o://1777129e8e8a33beb690e0c487643173eccbbdbb009a127f68facceba9c56f78" gracePeriod=604796 Jun 13 05:09:22 crc kubenswrapper[4894]: I0613 05:09:22.110311 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="rabbitmq" containerID="cri-o://b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58" gracePeriod=604796 Jun 13 05:09:23 crc kubenswrapper[4894]: I0613 05:09:23.837052 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jun 13 05:09:24 crc kubenswrapper[4894]: I0613 05:09:24.325231 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.236514 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.236816 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.236852 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.237456 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.237508 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6" gracePeriod=600 Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.899347 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6" exitCode=0 Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.899423 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6"} Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.899960 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf"} Jun 13 05:09:26 crc kubenswrapper[4894]: I0613 05:09:26.900007 4894 scope.go:117] "RemoveContainer" containerID="6367089c0046494147ef17f49cb4e195e9f71362d4ce23a0db0f939fd0580a47" Jun 13 05:09:27 crc kubenswrapper[4894]: I0613 05:09:27.916592 4894 generic.go:334] "Generic (PLEG): container finished" podID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerID="1777129e8e8a33beb690e0c487643173eccbbdbb009a127f68facceba9c56f78" exitCode=0 Jun 13 05:09:27 crc kubenswrapper[4894]: I0613 05:09:27.916674 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerDied","Data":"1777129e8e8a33beb690e0c487643173eccbbdbb009a127f68facceba9c56f78"} Jun 13 05:09:27 crc kubenswrapper[4894]: I0613 05:09:27.917156 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5347c46f-ac9a-4ec1-bf62-29e88fb89033","Type":"ContainerDied","Data":"d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75"} Jun 13 05:09:27 crc kubenswrapper[4894]: I0613 05:09:27.917205 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e7fdc5e49671492a92ae73561e279437556b7c1feb80582f6c8b5d2036da75" Jun 13 05:09:27 crc kubenswrapper[4894]: I0613 05:09:27.949324 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.090442 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67fn\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.090480 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.090513 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.090572 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091563 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091624 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091661 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091732 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091786 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091862 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: 
\"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.091885 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie\") pod \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\" (UID: \"5347c46f-ac9a-4ec1-bf62-29e88fb89033\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.092077 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.092267 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.093643 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.096399 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.097280 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn" (OuterVolumeSpecName: "kube-api-access-v67fn") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "kube-api-access-v67fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098276 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098299 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098310 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67fn\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-kube-api-access-v67fn\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098334 4894 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098343 4894 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-plugins-conf\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.098843 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.108827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info" (OuterVolumeSpecName: "pod-info") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.112534 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.119628 4894 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.151954 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data" (OuterVolumeSpecName: "config-data") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.160978 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf" (OuterVolumeSpecName: "server-conf") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200381 4894 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-server-conf\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200412 4894 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5347c46f-ac9a-4ec1-bf62-29e88fb89033-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200423 4894 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5347c46f-ac9a-4ec1-bf62-29e88fb89033-pod-info\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200431 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200439 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5347c46f-ac9a-4ec1-bf62-29e88fb89033-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.200449 4894 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.207392 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5347c46f-ac9a-4ec1-bf62-29e88fb89033" (UID: "5347c46f-ac9a-4ec1-bf62-29e88fb89033"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.301802 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5347c46f-ac9a-4ec1-bf62-29e88fb89033-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.626756 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732106 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732707 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732762 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732782 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732824 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732893 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732914 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732951 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbm6\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732974 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.732999 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: 
\"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.733031 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf\") pod \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\" (UID: \"f7fe1e93-4c05-4293-b36d-d65c9cec93a2\") " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.733463 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.733568 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.734225 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.748847 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.750410 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.765258 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.776724 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info" (OuterVolumeSpecName: "pod-info") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.780839 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6" (OuterVolumeSpecName: "kube-api-access-6vbm6") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "kube-api-access-6vbm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.787004 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data" (OuterVolumeSpecName: "config-data") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.828184 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf" (OuterVolumeSpecName: "server-conf") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837838 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837874 4894 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-server-conf\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837883 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837894 4894 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-plugins-conf\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837928 4894 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837938 4894 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837946 4894 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-pod-info\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837957 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc 
kubenswrapper[4894]: I0613 05:09:28.837964 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbm6\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-kube-api-access-6vbm6\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.837972 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.883185 4894 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.901193 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f7fe1e93-4c05-4293-b36d-d65c9cec93a2" (UID: "f7fe1e93-4c05-4293-b36d-d65c9cec93a2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.937116 4894 generic.go:334] "Generic (PLEG): container finished" podID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerID="b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58" exitCode=0 Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.938116 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.937561 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerDied","Data":"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58"} Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.938752 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f7fe1e93-4c05-4293-b36d-d65c9cec93a2","Type":"ContainerDied","Data":"4ec2a238d97949879d16f86bb45b20c81ae97eb58368d9ede1849064cb415c87"} Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.938778 4894 scope.go:117] "RemoveContainer" containerID="b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.937603 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.939066 4894 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7fe1e93-4c05-4293-b36d-d65c9cec93a2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.939091 4894 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.974600 4894 scope.go:117] "RemoveContainer" containerID="039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae" Jun 13 05:09:28 crc kubenswrapper[4894]: I0613 05:09:28.982464 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.004133 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.007150 4894 scope.go:117] "RemoveContainer" containerID="b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.013915 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58\": container with ID starting with b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58 not found: ID does not exist" containerID="b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.014130 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58"} err="failed to get container status \"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58\": rpc error: code = NotFound desc = could not find container \"b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58\": container with ID starting with b845b150ecbad2b840293f7f31d755638a9b47d32a7698e6acba9156e91cae58 not found: ID does not exist" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.014212 4894 scope.go:117] "RemoveContainer" containerID="039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.014689 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae\": container with ID starting with 039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae not found: ID does not exist" containerID="039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.014721 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae"} err="failed to get container status \"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae\": rpc error: code = NotFound desc = could not find container \"039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae\": container with ID starting with 039270b3889ab5b2da30708530ee9af651c53cdb9c861f8780946e2e0c1867ae not found: ID does not exist" Jun 13 05:09:29 
crc kubenswrapper[4894]: I0613 05:09:29.017665 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.024583 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039051 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.039385 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="setup-container" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039400 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="setup-container" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.039426 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039432 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.039441 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" containerName="container-00" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039447 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" containerName="container-00" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.039457 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="setup-container" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039463 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="setup-container" Jun 13 05:09:29 crc kubenswrapper[4894]: E0613 05:09:29.039488 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039494 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039640 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039663 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf01f5e-66fb-4df8-979a-9f0b81c5ad18" containerName="container-00" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.039678 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" containerName="rabbitmq" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.040533 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.045985 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046181 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046316 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046424 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046546 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046711 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.046826 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gqlpx" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.049873 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.051283 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.056581 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7sczr" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.056821 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.056905 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.056940 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.057107 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.058364 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.058518 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.073465 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.073512 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143407 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143451 4894 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5edbb43-096d-46e3-9e13-827e7eb51868-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143475 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5edbb43-096d-46e3-9e13-827e7eb51868-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143500 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143521 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143606 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143661 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143686 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143707 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143730 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 
05:09:29.143766 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143786 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbkkf\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-kube-api-access-gbkkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143905 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.143974 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144037 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76nt\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-kube-api-access-g76nt\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144116 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144155 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144207 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144222 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144252 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144291 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.144309 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246050 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246335 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246373 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246393 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246415 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246431 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246470 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246487 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbkkf\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-kube-api-access-gbkkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246509 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246528 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246551 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76nt\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-kube-api-access-g76nt\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246574 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246593 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246610 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246624 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246639 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 
05:09:29.246666 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246681 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246730 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5edbb43-096d-46e3-9e13-827e7eb51868-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246746 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5edbb43-096d-46e3-9e13-827e7eb51868-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246763 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.246928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247137 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247205 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247343 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247534 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247772 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.247906 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.248301 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.248776 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.252366 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.252805 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.253171 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.253722 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.254035 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.254644 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5edbb43-096d-46e3-9e13-827e7eb51868-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.255860 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5edbb43-096d-46e3-9e13-827e7eb51868-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.260511 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.261128 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5edbb43-096d-46e3-9e13-827e7eb51868-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.261889 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.263180 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.265610 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76nt\" (UniqueName: \"kubernetes.io/projected/c5edbb43-096d-46e3-9e13-827e7eb51868-kube-api-access-g76nt\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.266449 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbkkf\" (UniqueName: \"kubernetes.io/projected/c2d6cfd6-2bbf-4bcc-a837-28ab5958af73-kube-api-access-gbkkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.288762 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73\") " 
pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.292408 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5edbb43-096d-46e3-9e13-827e7eb51868\") " pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.388929 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.392628 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:09:29 crc kubenswrapper[4894]: I0613 05:09:29.954216 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.058960 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jun 13 05:09:30 crc kubenswrapper[4894]: W0613 05:09:30.067022 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5edbb43_096d_46e3_9e13_827e7eb51868.slice/crio-68e930790dfe92ed1ef947f7b0421b9a173027f3657069c2b6a43ad6e1ca5fc7 WatchSource:0}: Error finding container 68e930790dfe92ed1ef947f7b0421b9a173027f3657069c2b6a43ad6e1ca5fc7: Status 404 returned error can't find the container with id 68e930790dfe92ed1ef947f7b0421b9a173027f3657069c2b6a43ad6e1ca5fc7 Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.288569 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5347c46f-ac9a-4ec1-bf62-29e88fb89033" path="/var/lib/kubelet/pods/5347c46f-ac9a-4ec1-bf62-29e88fb89033/volumes" Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.289964 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7fe1e93-4c05-4293-b36d-d65c9cec93a2" path="/var/lib/kubelet/pods/f7fe1e93-4c05-4293-b36d-d65c9cec93a2/volumes" Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.963573 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5edbb43-096d-46e3-9e13-827e7eb51868","Type":"ContainerStarted","Data":"9094c7d45fe5b5f71840ec978072dab98bcfd89513dae9b6ef16e7404ac5f425"} Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.963893 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5edbb43-096d-46e3-9e13-827e7eb51868","Type":"ContainerStarted","Data":"68e930790dfe92ed1ef947f7b0421b9a173027f3657069c2b6a43ad6e1ca5fc7"} Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.967620 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73","Type":"ContainerStarted","Data":"e9c2923c8de4f81460bb26601ff10ee3ba78df61b39ee2357a65a59dc389f6d3"} Jun 13 05:09:30 crc kubenswrapper[4894]: I0613 05:09:30.967680 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73","Type":"ContainerStarted","Data":"a833242877ca9f1f47d395887a9e9eb804296df5148a5d1b8a1b8b3719ee2c2c"} Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.851376 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.853486 4894 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.857083 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.863563 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933717 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933753 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933828 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933866 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwdp\" (UniqueName: \"kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933892 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:32 crc kubenswrapper[4894]: I0613 05:09:32.933979 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035021 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035088 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwdp\" (UniqueName: \"kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: 
\"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035110 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035163 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035210 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.035226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.036203 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.036632 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.036863 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.037449 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.038987 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.066598 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwdp\" (UniqueName: \"kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp\") pod \"dnsmasq-dns-7c7fc5d5-rt74f\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.183297 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:33 crc kubenswrapper[4894]: I0613 05:09:33.668316 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:33 crc kubenswrapper[4894]: W0613 05:09:33.672978 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9a70e7_ace0_4a81_aa34_51974a540f20.slice/crio-b285811e9b5aca202b6ac277b5e52d9719bb17c671a9059b8a0185c049522911 WatchSource:0}: Error finding container b285811e9b5aca202b6ac277b5e52d9719bb17c671a9059b8a0185c049522911: Status 404 returned error can't find the container with id b285811e9b5aca202b6ac277b5e52d9719bb17c671a9059b8a0185c049522911 Jun 13 05:09:34 crc kubenswrapper[4894]: I0613 05:09:34.005386 4894 generic.go:334] "Generic (PLEG): container finished" podID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerID="21b45541ffca1571d1c4aed5a4c4d7548f1b5ab4d276c40f7ee6f7c48bf14e3e" exitCode=0 Jun 13 05:09:34 crc kubenswrapper[4894]: I0613 05:09:34.005438 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" event={"ID":"bc9a70e7-ace0-4a81-aa34-51974a540f20","Type":"ContainerDied","Data":"21b45541ffca1571d1c4aed5a4c4d7548f1b5ab4d276c40f7ee6f7c48bf14e3e"} Jun 13 05:09:34 crc kubenswrapper[4894]: I0613 05:09:34.005463 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" event={"ID":"bc9a70e7-ace0-4a81-aa34-51974a540f20","Type":"ContainerStarted","Data":"b285811e9b5aca202b6ac277b5e52d9719bb17c671a9059b8a0185c049522911"} Jun 13 05:09:35 crc kubenswrapper[4894]: I0613 05:09:35.023310 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" event={"ID":"bc9a70e7-ace0-4a81-aa34-51974a540f20","Type":"ContainerStarted","Data":"2adce3adc540d15d03e9d9b106dd18f16932bcd82984d23bcd2d77d3b5707b6f"} Jun 13 05:09:35 crc kubenswrapper[4894]: I0613 05:09:35.023734 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:35 crc kubenswrapper[4894]: I0613 05:09:35.055824 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" podStartSLOduration=3.055801121 podStartE2EDuration="3.055801121s" podCreationTimestamp="2025-06-13 05:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:09:35.055465971 +0000 UTC m=+1133.501713464" watchObservedRunningTime="2025-06-13 05:09:35.055801121 +0000 UTC m=+1133.502048614" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.184900 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.256882 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.257137 4894 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="dnsmasq-dns" containerID="cri-o://9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691" gracePeriod=10 Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.586030 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.597056 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.609209 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666693 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666761 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666788 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666836 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666856 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5tm\" (UniqueName: \"kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.666904 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.703518 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.768429 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb\") pod \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.768496 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config\") pod \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.768537 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lkt\" (UniqueName: \"kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt\") pod \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.768783 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc\") pod \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.768818 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb\") pod \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\" (UID: \"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311\") " Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769070 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769118 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769145 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769179 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769198 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h5tm\" (UniqueName: 
\"kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.769245 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.770117 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.771372 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.774867 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.774871 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.786085 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.789728 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt" (OuterVolumeSpecName: "kube-api-access-k9lkt") pod "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" (UID: "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311"). InnerVolumeSpecName "kube-api-access-k9lkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.793339 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h5tm\" (UniqueName: \"kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm\") pod \"dnsmasq-dns-59d8754cc-bjfcp\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.817295 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" (UID: "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.821083 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" (UID: "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.845087 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config" (OuterVolumeSpecName: "config") pod "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" (UID: "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.852593 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" (UID: "7d073d91-0f60-4ff1-a4eb-6c98a1d0a311"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.870975 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.870999 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lkt\" (UniqueName: \"kubernetes.io/projected/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-kube-api-access-k9lkt\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.871010 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.871019 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.871029 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:43 crc kubenswrapper[4894]: I0613 05:09:43.997940 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.135292 4894 generic.go:334] "Generic (PLEG): container finished" podID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerID="9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691" exitCode=0 Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.135634 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" event={"ID":"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311","Type":"ContainerDied","Data":"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691"} Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.135804 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" event={"ID":"7d073d91-0f60-4ff1-a4eb-6c98a1d0a311","Type":"ContainerDied","Data":"7d7cf133e4a23db2071bd4a967c6eef1922c96114c72238f9532b687bbab7123"} Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.135720 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746fb47d4f-4w269" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.135840 4894 scope.go:117] "RemoveContainer" containerID="9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.186437 4894 scope.go:117] "RemoveContainer" containerID="97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.195755 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.198174 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-746fb47d4f-4w269"] Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.227526 4894 scope.go:117] "RemoveContainer" containerID="9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691" Jun 13 05:09:44 crc kubenswrapper[4894]: E0613 05:09:44.228339 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691\": container with ID starting with 9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691 not found: ID does not exist" containerID="9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.228369 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691"} err="failed to get container status \"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691\": rpc error: code = NotFound desc = could not find container \"9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691\": container with ID starting with 9ed20d0ebe4173fcf50d0ed91389afb0d7b2e5351eed6deba9cf409e10781691 not found: ID does not exist" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.228389 4894 scope.go:117] "RemoveContainer" containerID="97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f" Jun 13 05:09:44 crc kubenswrapper[4894]: E0613 05:09:44.229137 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f\": container with ID starting with 97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f not found: ID does not exist" containerID="97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.229159 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f"} err="failed to get container status \"97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f\": rpc error: code = NotFound desc = could not find container \"97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f\": container with ID starting with 97f5d3b00d5dbfc6dbc809898e242d7513810350da5053a2c1fb5bb3d69fd54f not found: ID does not exist" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.286514 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" path="/var/lib/kubelet/pods/7d073d91-0f60-4ff1-a4eb-6c98a1d0a311/volumes" Jun 13 05:09:44 crc kubenswrapper[4894]: I0613 05:09:44.538691 4894 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:09:45 crc kubenswrapper[4894]: I0613 05:09:45.146019 4894 generic.go:334] "Generic (PLEG): container finished" podID="f691414e-759b-4a4b-808b-2f079c051452" containerID="30a1f810122c81cd07847844354f32feb1a8c600c924afb2c6e12a144f859eac" exitCode=0 Jun 13 05:09:45 crc kubenswrapper[4894]: I0613 05:09:45.146533 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" event={"ID":"f691414e-759b-4a4b-808b-2f079c051452","Type":"ContainerDied","Data":"30a1f810122c81cd07847844354f32feb1a8c600c924afb2c6e12a144f859eac"} Jun 13 05:09:45 crc kubenswrapper[4894]: I0613 05:09:45.146565 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" event={"ID":"f691414e-759b-4a4b-808b-2f079c051452","Type":"ContainerStarted","Data":"3844a338da683657d85fcef01033ecd71f61a30ce9c34dd24b0e50b7e2a4b0d7"} Jun 13 05:09:46 crc kubenswrapper[4894]: I0613 05:09:46.166753 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" event={"ID":"f691414e-759b-4a4b-808b-2f079c051452","Type":"ContainerStarted","Data":"886fbaa3b82a6e72a2127379cf3549221d8674c5af04fabc32456ff527d61166"} Jun 13 05:09:46 crc kubenswrapper[4894]: I0613 05:09:46.167214 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:46 crc kubenswrapper[4894]: I0613 05:09:46.195164 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" podStartSLOduration=3.195142722 podStartE2EDuration="3.195142722s" podCreationTimestamp="2025-06-13 05:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:09:46.193241878 +0000 UTC m=+1144.639489371" watchObservedRunningTime="2025-06-13 05:09:46.195142722 +0000 UTC m=+1144.641390195" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:53.999528 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.091412 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.091924 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="dnsmasq-dns" containerID="cri-o://2adce3adc540d15d03e9d9b106dd18f16932bcd82984d23bcd2d77d3b5707b6f" gracePeriod=10 Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.265178 4894 generic.go:334] "Generic (PLEG): container finished" podID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerID="2adce3adc540d15d03e9d9b106dd18f16932bcd82984d23bcd2d77d3b5707b6f" exitCode=0 Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.265484 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" event={"ID":"bc9a70e7-ace0-4a81-aa34-51974a540f20","Type":"ContainerDied","Data":"2adce3adc540d15d03e9d9b106dd18f16932bcd82984d23bcd2d77d3b5707b6f"} Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.544777 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.715143 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.715587 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwdp\" (UniqueName: \"kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.715830 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.715948 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.716042 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.716143 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc\") pod \"bc9a70e7-ace0-4a81-aa34-51974a540f20\" (UID: \"bc9a70e7-ace0-4a81-aa34-51974a540f20\") " Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.738187 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp" (OuterVolumeSpecName: "kube-api-access-2bwdp") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "kube-api-access-2bwdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.767379 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.777715 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.782302 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.791963 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.798438 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config" (OuterVolumeSpecName: "config") pod "bc9a70e7-ace0-4a81-aa34-51974a540f20" (UID: "bc9a70e7-ace0-4a81-aa34-51974a540f20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821456 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821489 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821501 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821512 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821520 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9a70e7-ace0-4a81-aa34-51974a540f20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:54 crc kubenswrapper[4894]: I0613 05:09:54.821529 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwdp\" (UniqueName: \"kubernetes.io/projected/bc9a70e7-ace0-4a81-aa34-51974a540f20-kube-api-access-2bwdp\") on node \"crc\" DevicePath \"\"" Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.274558 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" event={"ID":"bc9a70e7-ace0-4a81-aa34-51974a540f20","Type":"ContainerDied","Data":"b285811e9b5aca202b6ac277b5e52d9719bb17c671a9059b8a0185c049522911"} Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.274624 4894 scope.go:117] "RemoveContainer" containerID="2adce3adc540d15d03e9d9b106dd18f16932bcd82984d23bcd2d77d3b5707b6f" Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.274623 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fc5d5-rt74f" Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.294787 4894 scope.go:117] "RemoveContainer" containerID="21b45541ffca1571d1c4aed5a4c4d7548f1b5ab4d276c40f7ee6f7c48bf14e3e" Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.311947 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:55 crc kubenswrapper[4894]: I0613 05:09:55.320038 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c7fc5d5-rt74f"] Jun 13 05:09:56 crc kubenswrapper[4894]: I0613 05:09:56.297448 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" path="/var/lib/kubelet/pods/bc9a70e7-ace0-4a81-aa34-51974a540f20/volumes" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.732267 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt"] Jun 13 05:09:59 crc kubenswrapper[4894]: E0613 05:09:59.733101 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733120 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: E0613 05:09:59.733145 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="init" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733158 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="init" Jun 13 05:09:59 crc kubenswrapper[4894]: E0613 05:09:59.733197 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733210 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: E0613 05:09:59.733243 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="init" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733255 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="init" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733547 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9a70e7-ace0-4a81-aa34-51974a540f20" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.733575 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d073d91-0f60-4ff1-a4eb-6c98a1d0a311" containerName="dnsmasq-dns" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.734498 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.737479 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.737775 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.740619 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.740907 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.755428 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt"] Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.820008 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.820128 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.820175 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9blf\" (UniqueName: \"kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.820218 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.921748 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.921830 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9blf\" (UniqueName: \"kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.921893 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.922092 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.932746 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.933376 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.936973 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:09:59 crc kubenswrapper[4894]: I0613 05:09:59.945970 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9blf\" (UniqueName: \"kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:10:00 crc kubenswrapper[4894]: I0613 05:10:00.102072 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:10:00 crc kubenswrapper[4894]: I0613 05:10:00.742523 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt"] Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.343786 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" event={"ID":"e26dfa3d-1615-40aa-9ede-01780eeaf5d9","Type":"ContainerStarted","Data":"ebd622d3c64efcd387b91ae88a64c0c713dba91d59c2fd2b9f4a9157606b43a7"} Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.346818 4894 generic.go:334] "Generic (PLEG): container finished" podID="c2d6cfd6-2bbf-4bcc-a837-28ab5958af73" containerID="e9c2923c8de4f81460bb26601ff10ee3ba78df61b39ee2357a65a59dc389f6d3" exitCode=0 Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.346920 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73","Type":"ContainerDied","Data":"e9c2923c8de4f81460bb26601ff10ee3ba78df61b39ee2357a65a59dc389f6d3"} Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.350958 4894 generic.go:334] "Generic (PLEG): container finished" podID="c5edbb43-096d-46e3-9e13-827e7eb51868" containerID="9094c7d45fe5b5f71840ec978072dab98bcfd89513dae9b6ef16e7404ac5f425" exitCode=0 Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.350987 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5edbb43-096d-46e3-9e13-827e7eb51868","Type":"ContainerDied","Data":"9094c7d45fe5b5f71840ec978072dab98bcfd89513dae9b6ef16e7404ac5f425"} Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.967515 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-mdds9"] Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.970126 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mdds9" Jun 13 05:10:01 crc kubenswrapper[4894]: I0613 05:10:01.974083 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.054301 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.054441 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfcd\" (UniqueName: \"kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.155932 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.156009 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfcd\" (UniqueName: \"kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.156399 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.195347 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfcd\" (UniqueName: \"kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd\") pod \"crc-debug-mdds9\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.290188 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mdds9" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.384599 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mdds9" event={"ID":"f543dc73-216d-44d2-92a5-41bfe12d645a","Type":"ContainerStarted","Data":"325bf964a7a644423040fc289351bc5fd9aa8d9198769e62db96847bded975ec"} Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.388358 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5edbb43-096d-46e3-9e13-827e7eb51868","Type":"ContainerStarted","Data":"f21ca4d61133f9ba823037a3cec19957cc80d21ccc72f1950431267a65af4e24"} Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.388606 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.390795 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d6cfd6-2bbf-4bcc-a837-28ab5958af73","Type":"ContainerStarted","Data":"b24dda28af355f0a883f44f18d1ab5bc4418a54495a9875a8d9af093f247a652"} Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.391274 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:10:02 crc kubenswrapper[4894]: I0613 05:10:02.421972 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.421955666 podStartE2EDuration="34.421955666s" podCreationTimestamp="2025-06-13 05:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:10:02.415873623 +0000 UTC m=+1160.862121106" watchObservedRunningTime="2025-06-13 05:10:02.421955666 +0000 UTC m=+1160.868203129" Jun 13 05:10:03 crc kubenswrapper[4894]: I0613 05:10:03.405702 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mdds9" event={"ID":"f543dc73-216d-44d2-92a5-41bfe12d645a","Type":"ContainerStarted","Data":"69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7"} Jun 13 05:10:03 crc kubenswrapper[4894]: I0613 05:10:03.430066 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-mdds9" podStartSLOduration=2.430048099 podStartE2EDuration="2.430048099s" podCreationTimestamp="2025-06-13 05:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:10:03.420022455 +0000 UTC m=+1161.866269918" watchObservedRunningTime="2025-06-13 05:10:03.430048099 +0000 UTC m=+1161.876295562" Jun 13 05:10:03 crc kubenswrapper[4894]: I0613 05:10:03.430486 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.430478081 podStartE2EDuration="35.430478081s" podCreationTimestamp="2025-06-13 05:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:10:02.471704223 +0000 UTC m=+1160.917951686" watchObservedRunningTime="2025-06-13 05:10:03.430478081 +0000 UTC m=+1161.876725534" Jun 13 05:10:06 crc kubenswrapper[4894]: I0613 05:10:06.872052 4894 scope.go:117] "RemoveContainer" containerID="597c737a6ceae50c71e16da94b68f9fae3b11b7790de42f254bc4ad65fbd9c53" Jun 13 05:10:11 crc kubenswrapper[4894]: I0613 05:10:11.481321 4894 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" event={"ID":"e26dfa3d-1615-40aa-9ede-01780eeaf5d9","Type":"ContainerStarted","Data":"69ee6700749dd71e2e81c635d15c667c783e272d4e390ab145e605c458ab0024"} Jun 13 05:10:12 crc kubenswrapper[4894]: I0613 05:10:12.824964 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" podStartSLOduration=4.246679422 podStartE2EDuration="13.824946381s" podCreationTimestamp="2025-06-13 05:09:59 +0000 UTC" firstStartedPulling="2025-06-13 05:10:00.759725662 +0000 UTC m=+1159.205973135" lastFinishedPulling="2025-06-13 05:10:10.337992631 +0000 UTC m=+1168.784240094" observedRunningTime="2025-06-13 05:10:11.513570135 +0000 UTC m=+1169.959817638" watchObservedRunningTime="2025-06-13 05:10:12.824946381 +0000 UTC m=+1171.271193834" Jun 13 05:10:12 crc kubenswrapper[4894]: I0613 05:10:12.828218 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-mdds9"] Jun 13 05:10:12 crc kubenswrapper[4894]: I0613 05:10:12.828402 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-mdds9" podUID="f543dc73-216d-44d2-92a5-41bfe12d645a" containerName="container-00" containerID="cri-o://69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7" gracePeriod=2 Jun 13 05:10:12 crc kubenswrapper[4894]: I0613 05:10:12.838378 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-mdds9"] Jun 13 05:10:12 crc kubenswrapper[4894]: I0613 05:10:12.943429 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mdds9" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.070263 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhfcd\" (UniqueName: \"kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd\") pod \"f543dc73-216d-44d2-92a5-41bfe12d645a\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.070442 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host\") pod \"f543dc73-216d-44d2-92a5-41bfe12d645a\" (UID: \"f543dc73-216d-44d2-92a5-41bfe12d645a\") " Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.070479 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host" (OuterVolumeSpecName: "host") pod "f543dc73-216d-44d2-92a5-41bfe12d645a" (UID: "f543dc73-216d-44d2-92a5-41bfe12d645a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.071012 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f543dc73-216d-44d2-92a5-41bfe12d645a-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.088248 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd" (OuterVolumeSpecName: "kube-api-access-qhfcd") pod "f543dc73-216d-44d2-92a5-41bfe12d645a" (UID: "f543dc73-216d-44d2-92a5-41bfe12d645a"). InnerVolumeSpecName "kube-api-access-qhfcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.172894 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhfcd\" (UniqueName: \"kubernetes.io/projected/f543dc73-216d-44d2-92a5-41bfe12d645a-kube-api-access-qhfcd\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.502485 4894 generic.go:334] "Generic (PLEG): container finished" podID="f543dc73-216d-44d2-92a5-41bfe12d645a" containerID="69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7" exitCode=0 Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.502848 4894 scope.go:117] "RemoveContainer" containerID="69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.502915 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mdds9" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.534091 4894 scope.go:117] "RemoveContainer" containerID="69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7" Jun 13 05:10:13 crc kubenswrapper[4894]: E0613 05:10:13.534494 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7\": container with ID starting with 69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7 not found: ID does not exist" containerID="69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7" Jun 13 05:10:13 crc kubenswrapper[4894]: I0613 05:10:13.534543 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7"} err="failed to get container status \"69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7\": rpc error: code = NotFound desc = could not find container \"69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7\": container with ID starting with 69bf4b54feef2599a9f2f75e6a0cbf1acd8a3ceb801572ae2ca00ce1963c7aa7 not found: ID does not exist" Jun 13 05:10:14 crc kubenswrapper[4894]: I0613 05:10:14.292382 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f543dc73-216d-44d2-92a5-41bfe12d645a" path="/var/lib/kubelet/pods/f543dc73-216d-44d2-92a5-41bfe12d645a/volumes" Jun 13 05:10:19 crc kubenswrapper[4894]: I0613 05:10:19.392894 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jun 13 05:10:19 crc kubenswrapper[4894]: I0613 05:10:19.400299 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jun 13 05:10:23 crc kubenswrapper[4894]: I0613 05:10:23.624330 4894 generic.go:334] "Generic (PLEG): container finished" podID="e26dfa3d-1615-40aa-9ede-01780eeaf5d9" containerID="69ee6700749dd71e2e81c635d15c667c783e272d4e390ab145e605c458ab0024" exitCode=0 Jun 13 05:10:23 crc kubenswrapper[4894]: I0613 05:10:23.624374 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" event={"ID":"e26dfa3d-1615-40aa-9ede-01780eeaf5d9","Type":"ContainerDied","Data":"69ee6700749dd71e2e81c635d15c667c783e272d4e390ab145e605c458ab0024"} Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.231941 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.303828 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory\") pod \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.304009 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key\") pod \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.304897 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9blf\" (UniqueName: \"kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf\") pod \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.304960 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle\") pod \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\" (UID: \"e26dfa3d-1615-40aa-9ede-01780eeaf5d9\") " Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.310623 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e26dfa3d-1615-40aa-9ede-01780eeaf5d9" (UID: "e26dfa3d-1615-40aa-9ede-01780eeaf5d9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.310687 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf" (OuterVolumeSpecName: "kube-api-access-m9blf") pod "e26dfa3d-1615-40aa-9ede-01780eeaf5d9" (UID: "e26dfa3d-1615-40aa-9ede-01780eeaf5d9"). InnerVolumeSpecName "kube-api-access-m9blf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.331914 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory" (OuterVolumeSpecName: "inventory") pod "e26dfa3d-1615-40aa-9ede-01780eeaf5d9" (UID: "e26dfa3d-1615-40aa-9ede-01780eeaf5d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.346036 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e26dfa3d-1615-40aa-9ede-01780eeaf5d9" (UID: "e26dfa3d-1615-40aa-9ede-01780eeaf5d9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.406876 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.407110 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.407180 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9blf\" (UniqueName: \"kubernetes.io/projected/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-kube-api-access-m9blf\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.407322 4894 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26dfa3d-1615-40aa-9ede-01780eeaf5d9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.647264 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" event={"ID":"e26dfa3d-1615-40aa-9ede-01780eeaf5d9","Type":"ContainerDied","Data":"ebd622d3c64efcd387b91ae88a64c0c713dba91d59c2fd2b9f4a9157606b43a7"} Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.647302 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.647318 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd622d3c64efcd387b91ae88a64c0c713dba91d59c2fd2b9f4a9157606b43a7" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.739669 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw"] Jun 13 05:10:25 crc kubenswrapper[4894]: E0613 05:10:25.740171 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26dfa3d-1615-40aa-9ede-01780eeaf5d9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.740187 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26dfa3d-1615-40aa-9ede-01780eeaf5d9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:10:25 crc kubenswrapper[4894]: E0613 05:10:25.740219 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f543dc73-216d-44d2-92a5-41bfe12d645a" containerName="container-00" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.740225 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f543dc73-216d-44d2-92a5-41bfe12d645a" containerName="container-00" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.740370 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26dfa3d-1615-40aa-9ede-01780eeaf5d9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.740389 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f543dc73-216d-44d2-92a5-41bfe12d645a" containerName="container-00" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.740903 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.743820 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.752081 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.752268 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw"] Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.752485 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.752631 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.814196 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svn5z\" (UniqueName: \"kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.814842 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.815007 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.815142 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.918093 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.918368 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svn5z\" (UniqueName: \"kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.918415 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.918466 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.923328 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.924435 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.925199 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:25 crc kubenswrapper[4894]: I0613 05:10:25.935148 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svn5z\" (UniqueName: \"kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:26 crc kubenswrapper[4894]: I0613 05:10:26.063980 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:10:26 crc kubenswrapper[4894]: I0613 05:10:26.696259 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw"] Jun 13 05:10:26 crc kubenswrapper[4894]: W0613 05:10:26.710707 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc05db1ce_0491_40b6_a148_be6b414542bc.slice/crio-86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2 WatchSource:0}: Error finding container 86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2: Status 404 returned error can't find the container with id 86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2 Jun 13 05:10:27 crc kubenswrapper[4894]: I0613 05:10:27.673304 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" event={"ID":"c05db1ce-0491-40b6-a148-be6b414542bc","Type":"ContainerStarted","Data":"7fec164af3701d73e78da5c63e2a0bd8b9db87d7d559a397fb0fa9fac82c7ca9"} Jun 13 05:10:27 crc kubenswrapper[4894]: I0613 05:10:27.673843 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" event={"ID":"c05db1ce-0491-40b6-a148-be6b414542bc","Type":"ContainerStarted","Data":"86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2"} Jun 13 05:10:27 crc kubenswrapper[4894]: I0613 05:10:27.700889 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" podStartSLOduration=2.165305434 podStartE2EDuration="2.700861738s" podCreationTimestamp="2025-06-13 05:10:25 +0000 UTC" firstStartedPulling="2025-06-13 05:10:26.714298644 +0000 UTC m=+1185.160546137" lastFinishedPulling="2025-06-13 05:10:27.249854978 +0000 UTC m=+1185.696102441" observedRunningTime="2025-06-13 05:10:27.692285285 +0000 UTC m=+1186.138532758" watchObservedRunningTime="2025-06-13 05:10:27.700861738 +0000 UTC m=+1186.147109211" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.213038 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-zvq4x"] Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.215877 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.221135 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.335451 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz95c\" (UniqueName: \"kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.335723 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.437633 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz95c\" (UniqueName: \"kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.437731 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.438172 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.459126 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz95c\" (UniqueName: \"kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c\") pod \"crc-debug-zvq4x\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " pod="openstack/crc-debug-zvq4x" Jun 13 05:11:02 crc kubenswrapper[4894]: I0613 05:11:02.555523 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-zvq4x" Jun 13 05:11:03 crc kubenswrapper[4894]: I0613 05:11:03.086903 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-zvq4x" event={"ID":"88dc6eb8-dd8c-4f29-992c-8e346ebb9812","Type":"ContainerStarted","Data":"ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1"} Jun 13 05:11:03 crc kubenswrapper[4894]: I0613 05:11:03.087318 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-zvq4x" event={"ID":"88dc6eb8-dd8c-4f29-992c-8e346ebb9812","Type":"ContainerStarted","Data":"90896f066e52c7b064e3e458f7dd9378e441aa4fea89c8c08a7584ad59bc163c"} Jun 13 05:11:03 crc kubenswrapper[4894]: I0613 05:11:03.106308 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-zvq4x" podStartSLOduration=1.106293488 podStartE2EDuration="1.106293488s" podCreationTimestamp="2025-06-13 05:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:11:03.101847052 +0000 UTC m=+1221.548094545" watchObservedRunningTime="2025-06-13 05:11:03.106293488 +0000 UTC m=+1221.552540951" Jun 13 05:11:10 crc kubenswrapper[4894]: I0613 05:11:10.356836 4894 scope.go:117] "RemoveContainer" containerID="c39332d60ca356dea2df67b7e64412f3b8d72505e36188b7f690e8ce86664439" Jun 13 05:11:10 crc kubenswrapper[4894]: I0613 05:11:10.438147 4894 scope.go:117] "RemoveContainer" containerID="1777129e8e8a33beb690e0c487643173eccbbdbb009a127f68facceba9c56f78" Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.237940 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-zvq4x"] Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.238690 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-zvq4x" podUID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" containerName="container-00" containerID="cri-o://ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1" gracePeriod=2 Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.245303 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-zvq4x"] Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.330093 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-zvq4x" Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.466475 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz95c\" (UniqueName: \"kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c\") pod \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.466620 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host\") pod \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\" (UID: \"88dc6eb8-dd8c-4f29-992c-8e346ebb9812\") " Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.468122 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host" (OuterVolumeSpecName: "host") pod "88dc6eb8-dd8c-4f29-992c-8e346ebb9812" (UID: "88dc6eb8-dd8c-4f29-992c-8e346ebb9812"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.475631 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c" (OuterVolumeSpecName: "kube-api-access-jz95c") pod "88dc6eb8-dd8c-4f29-992c-8e346ebb9812" (UID: "88dc6eb8-dd8c-4f29-992c-8e346ebb9812"). InnerVolumeSpecName "kube-api-access-jz95c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.570015 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz95c\" (UniqueName: \"kubernetes.io/projected/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-kube-api-access-jz95c\") on node \"crc\" DevicePath \"\"" Jun 13 05:11:13 crc kubenswrapper[4894]: I0613 05:11:13.570368 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/88dc6eb8-dd8c-4f29-992c-8e346ebb9812-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.218484 4894 generic.go:334] "Generic (PLEG): container finished" podID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" containerID="ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1" exitCode=0 Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.218532 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-zvq4x" Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.218566 4894 scope.go:117] "RemoveContainer" containerID="ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1" Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.247124 4894 scope.go:117] "RemoveContainer" containerID="ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1" Jun 13 05:11:14 crc kubenswrapper[4894]: E0613 05:11:14.249609 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1\": container with ID starting with ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1 not found: ID does not exist" containerID="ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1" Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.249670 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1"} err="failed to get container status \"ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1\": rpc error: code = NotFound desc = could not find container \"ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1\": container with ID starting with ba5ef4d570a84bab149f8ed8fa8bbbfba38b8cc421673c45159d3ee6fae702b1 not found: ID does not exist" Jun 13 05:11:14 crc kubenswrapper[4894]: I0613 05:11:14.293269 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" path="/var/lib/kubelet/pods/88dc6eb8-dd8c-4f29-992c-8e346ebb9812/volumes" Jun 13 05:11:26 crc kubenswrapper[4894]: I0613 05:11:26.236625 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:11:26 crc kubenswrapper[4894]: I0613 
05:11:26.237190 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:11:56 crc kubenswrapper[4894]: I0613 05:11:56.238765 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:11:56 crc kubenswrapper[4894]: I0613 05:11:56.240119 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.630922 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-948hs"] Jun 13 05:12:01 crc kubenswrapper[4894]: E0613 05:12:01.631954 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" containerName="container-00" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.631974 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" containerName="container-00" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.632332 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dc6eb8-dd8c-4f29-992c-8e346ebb9812" containerName="container-00" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.633223 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.636718 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.773725 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gfb\" (UniqueName: \"kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.774147 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.876044 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.876199 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gfb\" (UniqueName: \"kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.876432 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.908454 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gfb\" (UniqueName: \"kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb\") pod \"crc-debug-948hs\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " pod="openstack/crc-debug-948hs" Jun 13 05:12:01 crc kubenswrapper[4894]: I0613 05:12:01.967432 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-948hs" Jun 13 05:12:02 crc kubenswrapper[4894]: I0613 05:12:02.762959 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-948hs" event={"ID":"351d8816-bf2c-45fc-bc9e-37345587ac8c","Type":"ContainerStarted","Data":"5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9"} Jun 13 05:12:02 crc kubenswrapper[4894]: I0613 05:12:02.763263 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-948hs" event={"ID":"351d8816-bf2c-45fc-bc9e-37345587ac8c","Type":"ContainerStarted","Data":"21a9f89188dbdf425997e78190d435bf89b104fc68f8c19c1db1ff1ae4c12b94"} Jun 13 05:12:02 crc kubenswrapper[4894]: I0613 05:12:02.791509 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-948hs" podStartSLOduration=1.79147984 podStartE2EDuration="1.79147984s" podCreationTimestamp="2025-06-13 05:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:12:02.779746998 +0000 UTC m=+1281.225994501" watchObservedRunningTime="2025-06-13 05:12:02.79147984 +0000 UTC m=+1281.237727343" Jun 13 05:12:10 crc kubenswrapper[4894]: I0613 05:12:10.511478 4894 scope.go:117] "RemoveContainer" containerID="e69437542148f0d36daf735f854c50a5c721f0064f55e0a73fd7af808aa6c24d" Jun 13 05:12:10 crc kubenswrapper[4894]: I0613 05:12:10.558311 4894 scope.go:117] "RemoveContainer" containerID="696f501fc60c5500010487a14b6bc383bad97386109a2590af76f8e14abe1b9a" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.590821 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-948hs"] Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.592124 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-948hs" podUID="351d8816-bf2c-45fc-bc9e-37345587ac8c" containerName="container-00" containerID="cri-o://5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9" gracePeriod=2 Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.606433 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-948hs"] Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.682829 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-948hs" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.794708 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host\") pod \"351d8816-bf2c-45fc-bc9e-37345587ac8c\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.794779 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2gfb\" (UniqueName: \"kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb\") pod \"351d8816-bf2c-45fc-bc9e-37345587ac8c\" (UID: \"351d8816-bf2c-45fc-bc9e-37345587ac8c\") " Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.795213 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host" (OuterVolumeSpecName: "host") pod "351d8816-bf2c-45fc-bc9e-37345587ac8c" (UID: "351d8816-bf2c-45fc-bc9e-37345587ac8c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.796933 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/351d8816-bf2c-45fc-bc9e-37345587ac8c-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.801916 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb" (OuterVolumeSpecName: "kube-api-access-c2gfb") pod "351d8816-bf2c-45fc-bc9e-37345587ac8c" (UID: "351d8816-bf2c-45fc-bc9e-37345587ac8c"). InnerVolumeSpecName "kube-api-access-c2gfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.871137 4894 generic.go:334] "Generic (PLEG): container finished" podID="351d8816-bf2c-45fc-bc9e-37345587ac8c" containerID="5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9" exitCode=0 Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.871196 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-948hs" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.871232 4894 scope.go:117] "RemoveContainer" containerID="5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.898124 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2gfb\" (UniqueName: \"kubernetes.io/projected/351d8816-bf2c-45fc-bc9e-37345587ac8c-kube-api-access-c2gfb\") on node \"crc\" DevicePath \"\"" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.903483 4894 scope.go:117] "RemoveContainer" containerID="5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9" Jun 13 05:12:12 crc kubenswrapper[4894]: E0613 05:12:12.904136 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9\": container with ID starting with 5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9 not found: ID does not exist" containerID="5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9" Jun 13 05:12:12 crc kubenswrapper[4894]: I0613 05:12:12.904877 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9"} err="failed to get container status \"5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9\": rpc error: code = NotFound desc = could not find container \"5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9\": container with ID starting with 5c4ed8d8d73922f5bc03bc147033343ed054bdc37046b042ef02b0d041ddacc9 not found: ID does not exist" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.293183 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351d8816-bf2c-45fc-bc9e-37345587ac8c" path="/var/lib/kubelet/pods/351d8816-bf2c-45fc-bc9e-37345587ac8c/volumes" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.893442 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:14 crc kubenswrapper[4894]: E0613 05:12:14.893900 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351d8816-bf2c-45fc-bc9e-37345587ac8c" containerName="container-00" Jun 13 05:12:14 crc 
kubenswrapper[4894]: I0613 05:12:14.893926 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="351d8816-bf2c-45fc-bc9e-37345587ac8c" containerName="container-00" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.894233 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="351d8816-bf2c-45fc-bc9e-37345587ac8c" containerName="container-00" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.899337 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.928003 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.952187 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.952252 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h56kq\" (UniqueName: \"kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:14 crc kubenswrapper[4894]: I0613 05:12:14.952281 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.053285 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.053358 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h56kq\" (UniqueName: \"kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.053381 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.054028 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " 
pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.054689 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.087385 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h56kq\" (UniqueName: \"kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq\") pod \"certified-operators-z8sw6\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.228031 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.748092 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:15 crc kubenswrapper[4894]: I0613 05:12:15.903925 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerStarted","Data":"9b48a3b5b1f6f9cda841938e6e38c33c17fff6dbe5902b597c4ae7bf33c815ec"} Jun 13 05:12:16 crc kubenswrapper[4894]: I0613 05:12:16.933099 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fc19190-399d-4294-9672-423be9b7696f" containerID="b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9" exitCode=0 Jun 13 05:12:16 crc kubenswrapper[4894]: I0613 05:12:16.933941 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerDied","Data":"b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9"} Jun 13 05:12:17 crc kubenswrapper[4894]: I0613 05:12:17.946762 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerStarted","Data":"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0"} Jun 13 05:12:18 crc kubenswrapper[4894]: I0613 05:12:18.961488 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fc19190-399d-4294-9672-423be9b7696f" containerID="e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0" exitCode=0 Jun 13 05:12:18 crc kubenswrapper[4894]: I0613 05:12:18.961548 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerDied","Data":"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0"} Jun 13 05:12:19 crc kubenswrapper[4894]: I0613 05:12:19.975924 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerStarted","Data":"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443"} Jun 13 05:12:20 crc kubenswrapper[4894]: I0613 05:12:20.000936 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z8sw6" podStartSLOduration=3.505462666 
podStartE2EDuration="6.000914175s" podCreationTimestamp="2025-06-13 05:12:14 +0000 UTC" firstStartedPulling="2025-06-13 05:12:16.938065391 +0000 UTC m=+1295.384312894" lastFinishedPulling="2025-06-13 05:12:19.43351693 +0000 UTC m=+1297.879764403" observedRunningTime="2025-06-13 05:12:19.998361972 +0000 UTC m=+1298.444609475" watchObservedRunningTime="2025-06-13 05:12:20.000914175 +0000 UTC m=+1298.447161648" Jun 13 05:12:25 crc kubenswrapper[4894]: I0613 05:12:25.228932 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:25 crc kubenswrapper[4894]: I0613 05:12:25.229346 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:25 crc kubenswrapper[4894]: I0613 05:12:25.300582 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.138565 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.221883 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.236160 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.236222 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.236272 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.237013 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:12:26 crc kubenswrapper[4894]: I0613 05:12:26.237076 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf" gracePeriod=600 Jun 13 05:12:27 crc kubenswrapper[4894]: I0613 05:12:27.073431 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf" exitCode=0 Jun 13 05:12:27 crc kubenswrapper[4894]: I0613 05:12:27.073520 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf"} Jun 13 05:12:27 crc kubenswrapper[4894]: I0613 05:12:27.073979 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b"} Jun 13 05:12:27 crc kubenswrapper[4894]: I0613 05:12:27.074015 4894 scope.go:117] "RemoveContainer" containerID="ff8016684c004f232b6a504c33ef795a7218e2d876a546cbde879c8c977497c6" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.086996 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z8sw6" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="registry-server" containerID="cri-o://3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443" gracePeriod=2 Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.588977 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.734618 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content\") pod \"9fc19190-399d-4294-9672-423be9b7696f\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.735002 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h56kq\" (UniqueName: \"kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq\") pod \"9fc19190-399d-4294-9672-423be9b7696f\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.735177 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities\") pod \"9fc19190-399d-4294-9672-423be9b7696f\" (UID: \"9fc19190-399d-4294-9672-423be9b7696f\") " Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.736607 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities" (OuterVolumeSpecName: "utilities") pod "9fc19190-399d-4294-9672-423be9b7696f" (UID: "9fc19190-399d-4294-9672-423be9b7696f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.745069 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq" (OuterVolumeSpecName: "kube-api-access-h56kq") pod "9fc19190-399d-4294-9672-423be9b7696f" (UID: "9fc19190-399d-4294-9672-423be9b7696f"). InnerVolumeSpecName "kube-api-access-h56kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.787852 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fc19190-399d-4294-9672-423be9b7696f" (UID: "9fc19190-399d-4294-9672-423be9b7696f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.837266 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.837297 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h56kq\" (UniqueName: \"kubernetes.io/projected/9fc19190-399d-4294-9672-423be9b7696f-kube-api-access-h56kq\") on node \"crc\" DevicePath \"\"" Jun 13 05:12:28 crc kubenswrapper[4894]: I0613 05:12:28.837312 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc19190-399d-4294-9672-423be9b7696f-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.100937 4894 generic.go:334] "Generic (PLEG): container finished" podID="9fc19190-399d-4294-9672-423be9b7696f" containerID="3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443" exitCode=0 Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.101050 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z8sw6" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.101783 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerDied","Data":"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443"} Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.101842 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z8sw6" event={"ID":"9fc19190-399d-4294-9672-423be9b7696f","Type":"ContainerDied","Data":"9b48a3b5b1f6f9cda841938e6e38c33c17fff6dbe5902b597c4ae7bf33c815ec"} Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.101872 4894 scope.go:117] "RemoveContainer" containerID="3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.137295 4894 scope.go:117] "RemoveContainer" containerID="e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.159278 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.173801 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z8sw6"] Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.178006 4894 scope.go:117] "RemoveContainer" containerID="b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.216911 4894 scope.go:117] "RemoveContainer" containerID="3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443" Jun 13 05:12:29 crc kubenswrapper[4894]: E0613 05:12:29.217560 4894 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443\": container with ID starting with 3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443 not found: ID does not exist" containerID="3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.217604 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443"} err="failed to get container status \"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443\": rpc error: code = NotFound desc = could not find container \"3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443\": container with ID starting with 3e53b82721ee166c5313c5594d69389ba7a21fb9475cf79d68cdcdb0d323f443 not found: ID does not exist" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.217646 4894 scope.go:117] "RemoveContainer" containerID="e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0" Jun 13 05:12:29 crc kubenswrapper[4894]: E0613 05:12:29.218043 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0\": container with ID starting with e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0 not found: ID does not exist" containerID="e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.218125 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0"} err="failed to get container status \"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0\": rpc error: code = NotFound desc = could not find container \"e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0\": container with ID starting with e3d37c6b4404e87faaf5837b9ba25446d3d269434a2ae8d84132dc96c22055f0 not found: ID does not exist" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.218146 4894 scope.go:117] "RemoveContainer" containerID="b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9" Jun 13 05:12:29 crc kubenswrapper[4894]: E0613 05:12:29.218483 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9\": container with ID starting with b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9 not found: ID does not exist" containerID="b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9" Jun 13 05:12:29 crc kubenswrapper[4894]: I0613 05:12:29.218547 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9"} err="failed to get container status \"b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9\": rpc error: code = NotFound desc = could not find container \"b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9\": container with ID starting with b3602521bf1b69a9f35bfb273bbd042e0c5618e166b80c843cfedde8fc6b81b9 not found: ID does not exist" Jun 13 05:12:30 crc kubenswrapper[4894]: I0613 05:12:30.321285 4894 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="9fc19190-399d-4294-9672-423be9b7696f" path="/var/lib/kubelet/pods/9fc19190-399d-4294-9672-423be9b7696f/volumes" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.462682 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:12:46 crc kubenswrapper[4894]: E0613 05:12:46.463700 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="extract-utilities" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.463716 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="extract-utilities" Jun 13 05:12:46 crc kubenswrapper[4894]: E0613 05:12:46.463739 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="extract-content" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.463748 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="extract-content" Jun 13 05:12:46 crc kubenswrapper[4894]: E0613 05:12:46.463775 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="registry-server" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.463783 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="registry-server" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.464022 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc19190-399d-4294-9672-423be9b7696f" containerName="registry-server" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.465837 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.493565 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.648589 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.649263 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2wl\" (UniqueName: \"kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.649472 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.750981 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2wl\" (UniqueName: \"kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.751088 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.751149 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.751572 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.751854 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.783573 4894 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zq2wl\" (UniqueName: \"kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl\") pod \"redhat-marketplace-98stx\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:46 crc kubenswrapper[4894]: I0613 05:12:46.786326 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:47 crc kubenswrapper[4894]: I0613 05:12:47.095470 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:12:47 crc kubenswrapper[4894]: I0613 05:12:47.323096 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerStarted","Data":"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8"} Jun 13 05:12:47 crc kubenswrapper[4894]: I0613 05:12:47.324453 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerStarted","Data":"69377ab541108aa0de5819f731be7bfc5d342d3d931ba275c5b2d86ca0632723"} Jun 13 05:12:48 crc kubenswrapper[4894]: I0613 05:12:48.335984 4894 generic.go:334] "Generic (PLEG): container finished" podID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerID="5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8" exitCode=0 Jun 13 05:12:48 crc kubenswrapper[4894]: I0613 05:12:48.336081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerDied","Data":"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8"} Jun 13 05:12:49 crc kubenswrapper[4894]: I0613 05:12:49.351137 4894 generic.go:334] "Generic (PLEG): container finished" podID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerID="5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84" exitCode=0 Jun 13 05:12:49 crc kubenswrapper[4894]: I0613 05:12:49.352766 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerDied","Data":"5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84"} Jun 13 05:12:50 crc kubenswrapper[4894]: I0613 05:12:50.373532 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerStarted","Data":"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102"} Jun 13 05:12:56 crc kubenswrapper[4894]: I0613 05:12:56.787173 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:56 crc kubenswrapper[4894]: I0613 05:12:56.787763 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:56 crc kubenswrapper[4894]: I0613 05:12:56.840364 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:56 crc kubenswrapper[4894]: I0613 05:12:56.868281 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-98stx" 
podStartSLOduration=9.377570529 podStartE2EDuration="10.868256649s" podCreationTimestamp="2025-06-13 05:12:46 +0000 UTC" firstStartedPulling="2025-06-13 05:12:48.341032198 +0000 UTC m=+1326.787279711" lastFinishedPulling="2025-06-13 05:12:49.831718358 +0000 UTC m=+1328.277965831" observedRunningTime="2025-06-13 05:12:50.399881774 +0000 UTC m=+1328.846129237" watchObservedRunningTime="2025-06-13 05:12:56.868256649 +0000 UTC m=+1335.314504132" Jun 13 05:12:57 crc kubenswrapper[4894]: I0613 05:12:57.528859 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:12:57 crc kubenswrapper[4894]: I0613 05:12:57.571535 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:12:59 crc kubenswrapper[4894]: I0613 05:12:59.488030 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-98stx" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="registry-server" containerID="cri-o://e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102" gracePeriod=2 Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.060716 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.255004 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities\") pod \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.255234 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content\") pod \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.255339 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2wl\" (UniqueName: \"kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl\") pod \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\" (UID: \"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e\") " Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.256769 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities" (OuterVolumeSpecName: "utilities") pod "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" (UID: "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.268551 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" (UID: "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.269248 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl" (OuterVolumeSpecName: "kube-api-access-zq2wl") pod "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" (UID: "86e0c2e9-cbbf-4185-8045-02cca1a0aa2e"). InnerVolumeSpecName "kube-api-access-zq2wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.359801 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.359868 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.359902 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2wl\" (UniqueName: \"kubernetes.io/projected/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e-kube-api-access-zq2wl\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.502009 4894 generic.go:334] "Generic (PLEG): container finished" podID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerID="e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102" exitCode=0 Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.502068 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerDied","Data":"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102"} Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.502108 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98stx" event={"ID":"86e0c2e9-cbbf-4185-8045-02cca1a0aa2e","Type":"ContainerDied","Data":"69377ab541108aa0de5819f731be7bfc5d342d3d931ba275c5b2d86ca0632723"} Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.502138 4894 scope.go:117] "RemoveContainer" containerID="e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.502323 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98stx" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.536689 4894 scope.go:117] "RemoveContainer" containerID="5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.537856 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.546521 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-98stx"] Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.579136 4894 scope.go:117] "RemoveContainer" containerID="5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.620118 4894 scope.go:117] "RemoveContainer" containerID="e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102" Jun 13 05:13:00 crc kubenswrapper[4894]: E0613 05:13:00.620713 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102\": container with ID starting with e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102 not found: ID does not exist" containerID="e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.620772 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102"} err="failed to get container status \"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102\": rpc error: code = NotFound desc = could not find container \"e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102\": container with ID starting with e6727c1d3539b2fd8e34cd97b4b74b0549ec252fc55796d289b15cb21e08d102 not found: ID does not exist" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.620799 4894 scope.go:117] "RemoveContainer" containerID="5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84" Jun 13 05:13:00 crc kubenswrapper[4894]: E0613 05:13:00.621435 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84\": container with ID starting with 5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84 not found: ID does not exist" containerID="5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.621761 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84"} err="failed to get container status \"5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84\": rpc error: code = NotFound desc = could not find container \"5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84\": container with ID starting with 5ba3a337ebe126110d666a3f5b99720ff05915bbfaadc4546211012a16530b84 not found: ID does not exist" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.621923 4894 scope.go:117] "RemoveContainer" containerID="5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8" Jun 13 05:13:00 crc kubenswrapper[4894]: E0613 05:13:00.622514 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8\": container with ID starting with 5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8 not found: ID does not exist" containerID="5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8" Jun 13 05:13:00 crc kubenswrapper[4894]: I0613 05:13:00.622560 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8"} err="failed to get container status \"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8\": rpc error: code = NotFound desc = could not find container \"5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8\": container with ID starting with 5e02bc5d61487c90e409366f41e4f67f428a4641eb5980e2aed71be552787db8 not found: ID does not exist" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.055834 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-jvhrh"] Jun 13 05:13:02 crc kubenswrapper[4894]: E0613 05:13:02.057116 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="extract-utilities" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.057141 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="extract-utilities" Jun 13 05:13:02 crc kubenswrapper[4894]: E0613 05:13:02.057190 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="registry-server" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.057203 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="registry-server" Jun 13 05:13:02 crc kubenswrapper[4894]: E0613 05:13:02.057220 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="extract-content" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.057236 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="extract-content" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.057796 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" containerName="registry-server" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.058815 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.062532 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.195455 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.195575 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmztv\" (UniqueName: \"kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.291330 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e0c2e9-cbbf-4185-8045-02cca1a0aa2e" path="/var/lib/kubelet/pods/86e0c2e9-cbbf-4185-8045-02cca1a0aa2e/volumes" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.297449 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.297553 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.297661 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmztv\" (UniqueName: \"kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.319173 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmztv\" (UniqueName: \"kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv\") pod \"crc-debug-jvhrh\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.392901 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-jvhrh" Jun 13 05:13:02 crc kubenswrapper[4894]: W0613 05:13:02.423135 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b5544da_20b7_4ec3_8f01_a103f1162300.slice/crio-6909ca60dd6c5092cee8f3dc0e5256fb2c0cf84114d5176fb81150d213f1c2f0 WatchSource:0}: Error finding container 6909ca60dd6c5092cee8f3dc0e5256fb2c0cf84114d5176fb81150d213f1c2f0: Status 404 returned error can't find the container with id 6909ca60dd6c5092cee8f3dc0e5256fb2c0cf84114d5176fb81150d213f1c2f0 Jun 13 05:13:02 crc kubenswrapper[4894]: I0613 05:13:02.522017 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jvhrh" event={"ID":"7b5544da-20b7-4ec3-8f01-a103f1162300","Type":"ContainerStarted","Data":"6909ca60dd6c5092cee8f3dc0e5256fb2c0cf84114d5176fb81150d213f1c2f0"} Jun 13 05:13:03 crc kubenswrapper[4894]: I0613 05:13:03.536937 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jvhrh" event={"ID":"7b5544da-20b7-4ec3-8f01-a103f1162300","Type":"ContainerStarted","Data":"c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391"} Jun 13 05:13:03 crc kubenswrapper[4894]: I0613 05:13:03.556714 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-jvhrh" podStartSLOduration=1.55668826 podStartE2EDuration="1.55668826s" podCreationTimestamp="2025-06-13 05:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:13:03.555049424 +0000 UTC m=+1342.001296937" watchObservedRunningTime="2025-06-13 05:13:03.55668826 +0000 UTC m=+1342.002935763" Jun 13 05:13:10 crc kubenswrapper[4894]: I0613 05:13:10.677126 4894 scope.go:117] "RemoveContainer" containerID="f4e31b8d991b91ac265f1d86f3f292a8840f1bcc159aab333d76522cc161bfda" Jun 13 05:13:10 crc kubenswrapper[4894]: I0613 05:13:10.708936 4894 scope.go:117] "RemoveContainer" containerID="8cf832ccdc83dc8f468ced5b0ec2ad3a64f4e279efc56d39d8ddb184fa151697" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.688082 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.689696 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.709673 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.797219 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9cc5\" (UniqueName: \"kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.797295 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.797552 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.899424 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.899554 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9cc5\" (UniqueName: \"kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.899607 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.899981 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.900007 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:11 crc kubenswrapper[4894]: I0613 05:13:11.924624 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q9cc5\" (UniqueName: \"kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5\") pod \"redhat-operators-wlv67\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.009361 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.265085 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.635372 4894 generic.go:334] "Generic (PLEG): container finished" podID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerID="092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1" exitCode=0 Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.635421 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerDied","Data":"092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1"} Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.635452 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerStarted","Data":"6fa8688c25bf2c0b34aea6c38086bedc8b6a4a3fe78d2c1b27d7e2db8e57d123"} Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.965523 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-jvhrh"] Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.965755 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-jvhrh" podUID="7b5544da-20b7-4ec3-8f01-a103f1162300" containerName="container-00" containerID="cri-o://c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391" gracePeriod=2 Jun 13 05:13:12 crc kubenswrapper[4894]: I0613 05:13:12.975047 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-jvhrh"] Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.065517 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jvhrh" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.236251 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host\") pod \"7b5544da-20b7-4ec3-8f01-a103f1162300\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.236381 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmztv\" (UniqueName: \"kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv\") pod \"7b5544da-20b7-4ec3-8f01-a103f1162300\" (UID: \"7b5544da-20b7-4ec3-8f01-a103f1162300\") " Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.236461 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host" (OuterVolumeSpecName: "host") pod "7b5544da-20b7-4ec3-8f01-a103f1162300" (UID: "7b5544da-20b7-4ec3-8f01-a103f1162300"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.237171 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b5544da-20b7-4ec3-8f01-a103f1162300-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.245851 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv" (OuterVolumeSpecName: "kube-api-access-cmztv") pod "7b5544da-20b7-4ec3-8f01-a103f1162300" (UID: "7b5544da-20b7-4ec3-8f01-a103f1162300"). InnerVolumeSpecName "kube-api-access-cmztv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.348536 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmztv\" (UniqueName: \"kubernetes.io/projected/7b5544da-20b7-4ec3-8f01-a103f1162300-kube-api-access-cmztv\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.653736 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerStarted","Data":"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2"} Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.656491 4894 generic.go:334] "Generic (PLEG): container finished" podID="7b5544da-20b7-4ec3-8f01-a103f1162300" containerID="c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391" exitCode=0 Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.656553 4894 scope.go:117] "RemoveContainer" containerID="c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.656714 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-jvhrh" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.696387 4894 scope.go:117] "RemoveContainer" containerID="c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391" Jun 13 05:13:13 crc kubenswrapper[4894]: E0613 05:13:13.697076 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391\": container with ID starting with c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391 not found: ID does not exist" containerID="c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391" Jun 13 05:13:13 crc kubenswrapper[4894]: I0613 05:13:13.697116 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391"} err="failed to get container status \"c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391\": rpc error: code = NotFound desc = could not find container \"c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391\": container with ID starting with c3cd541b84e7036f42604e68fa999e801422f080ad7d720b646b239cd93be391 not found: ID does not exist" Jun 13 05:13:14 crc kubenswrapper[4894]: I0613 05:13:14.290429 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5544da-20b7-4ec3-8f01-a103f1162300" path="/var/lib/kubelet/pods/7b5544da-20b7-4ec3-8f01-a103f1162300/volumes" Jun 13 05:13:16 crc kubenswrapper[4894]: I0613 05:13:16.685264 4894 generic.go:334] "Generic (PLEG): container finished" podID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerID="b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2" exitCode=0 Jun 13 05:13:16 crc kubenswrapper[4894]: I0613 05:13:16.685359 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerDied","Data":"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2"} Jun 13 05:13:17 crc kubenswrapper[4894]: I0613 05:13:17.698092 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerStarted","Data":"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc"} Jun 13 05:13:17 crc kubenswrapper[4894]: I0613 05:13:17.731173 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlv67" podStartSLOduration=2.225856062 podStartE2EDuration="6.731150221s" podCreationTimestamp="2025-06-13 05:13:11 +0000 UTC" firstStartedPulling="2025-06-13 05:13:12.637249277 +0000 UTC m=+1351.083496740" lastFinishedPulling="2025-06-13 05:13:17.142543426 +0000 UTC m=+1355.588790899" observedRunningTime="2025-06-13 05:13:17.728828065 +0000 UTC m=+1356.175075558" watchObservedRunningTime="2025-06-13 05:13:17.731150221 +0000 UTC m=+1356.177397694" Jun 13 05:13:21 crc kubenswrapper[4894]: I0613 05:13:21.953485 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:21 crc kubenswrapper[4894]: E0613 05:13:21.954730 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5544da-20b7-4ec3-8f01-a103f1162300" containerName="container-00" Jun 13 05:13:21 crc kubenswrapper[4894]: I0613 05:13:21.954749 4894 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7b5544da-20b7-4ec3-8f01-a103f1162300" containerName="container-00" Jun 13 05:13:21 crc kubenswrapper[4894]: I0613 05:13:21.955251 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5544da-20b7-4ec3-8f01-a103f1162300" containerName="container-00" Jun 13 05:13:21 crc kubenswrapper[4894]: I0613 05:13:21.958034 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.001781 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.009696 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.009741 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.061893 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.109973 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.110025 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qsm\" (UniqueName: \"kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.110725 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.212846 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.213179 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.213209 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qsm\" (UniqueName: \"kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm\") pod \"community-operators-p4wk9\" (UID: 
\"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.213631 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.213814 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.259809 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qsm\" (UniqueName: \"kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm\") pod \"community-operators-p4wk9\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.290348 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.610974 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.747752 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerStarted","Data":"94a5900ee649e6c1b18c54a7aa1ca784038fe04cb7dce8e3c8ad79bd14ef97ba"} Jun 13 05:13:22 crc kubenswrapper[4894]: I0613 05:13:22.822970 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:23 crc kubenswrapper[4894]: I0613 05:13:23.773605 4894 generic.go:334] "Generic (PLEG): container finished" podID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerID="ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640" exitCode=0 Jun 13 05:13:23 crc kubenswrapper[4894]: I0613 05:13:23.775552 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerDied","Data":"ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640"} Jun 13 05:13:24 crc kubenswrapper[4894]: I0613 05:13:24.302995 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:24 crc kubenswrapper[4894]: I0613 05:13:24.788634 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerStarted","Data":"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305"} Jun 13 05:13:25 crc kubenswrapper[4894]: I0613 05:13:25.796949 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlv67" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="registry-server" 
containerID="cri-o://3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc" gracePeriod=2 Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.741007 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.806418 4894 generic.go:334] "Generic (PLEG): container finished" podID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerID="3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305" exitCode=0 Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.806474 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerDied","Data":"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305"} Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.813082 4894 generic.go:334] "Generic (PLEG): container finished" podID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerID="3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc" exitCode=0 Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.813121 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerDied","Data":"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc"} Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.813147 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlv67" event={"ID":"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79","Type":"ContainerDied","Data":"6fa8688c25bf2c0b34aea6c38086bedc8b6a4a3fe78d2c1b27d7e2db8e57d123"} Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.813163 4894 scope.go:117] "RemoveContainer" containerID="3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.813296 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlv67" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.840510 4894 scope.go:117] "RemoveContainer" containerID="b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.864035 4894 scope.go:117] "RemoveContainer" containerID="092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.908891 4894 scope.go:117] "RemoveContainer" containerID="3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc" Jun 13 05:13:26 crc kubenswrapper[4894]: E0613 05:13:26.909404 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc\": container with ID starting with 3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc not found: ID does not exist" containerID="3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.909558 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc"} err="failed to get container status \"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc\": rpc error: code = NotFound desc = could not find container \"3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc\": container with ID starting with 3e80239f1c5d6ba233b43fdd29873edede9d7bc83e53fc24c5aeb8b6809713cc not found: ID does not exist" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.909677 4894 scope.go:117] "RemoveContainer" containerID="b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2" Jun 13 05:13:26 crc kubenswrapper[4894]: E0613 05:13:26.910228 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2\": container with ID starting with b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2 not found: ID does not exist" containerID="b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.910261 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2"} err="failed to get container status \"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2\": rpc error: code = NotFound desc = could not find container \"b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2\": container with ID starting with b6ce50d4184e3f7a43b11cb3172d9686cd0a1824d1e74933060937b3eb7f28f2 not found: ID does not exist" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.910283 4894 scope.go:117] "RemoveContainer" containerID="092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1" Jun 13 05:13:26 crc kubenswrapper[4894]: E0613 05:13:26.910706 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1\": container with ID starting with 092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1 not found: ID does not exist" containerID="092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1" 
Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.910826 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1"} err="failed to get container status \"092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1\": rpc error: code = NotFound desc = could not find container \"092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1\": container with ID starting with 092fdbbff6b12001a982122708fdd28a90d02772a7e2bb69968f8f1a89a68ff1 not found: ID does not exist" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.924592 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9cc5\" (UniqueName: \"kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5\") pod \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.925175 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content\") pod \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.925308 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities\") pod \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\" (UID: \"4eca2e73-8b2b-40d6-9e6d-a11c21e64e79\") " Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.926196 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities" (OuterVolumeSpecName: "utilities") pod "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" (UID: "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.932899 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5" (OuterVolumeSpecName: "kube-api-access-q9cc5") pod "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" (UID: "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79"). InnerVolumeSpecName "kube-api-access-q9cc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:13:26 crc kubenswrapper[4894]: I0613 05:13:26.996109 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" (UID: "4eca2e73-8b2b-40d6-9e6d-a11c21e64e79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.026694 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.026718 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.026728 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9cc5\" (UniqueName: \"kubernetes.io/projected/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79-kube-api-access-q9cc5\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.147253 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.157817 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlv67"] Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.825536 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerStarted","Data":"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0"} Jun 13 05:13:27 crc kubenswrapper[4894]: I0613 05:13:27.848599 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p4wk9" podStartSLOduration=3.298924373 podStartE2EDuration="6.848552088s" podCreationTimestamp="2025-06-13 05:13:21 +0000 UTC" firstStartedPulling="2025-06-13 05:13:23.77646766 +0000 UTC m=+1362.222715143" lastFinishedPulling="2025-06-13 05:13:27.326095395 +0000 UTC m=+1365.772342858" observedRunningTime="2025-06-13 05:13:27.843632598 +0000 UTC m=+1366.289880071" watchObservedRunningTime="2025-06-13 05:13:27.848552088 +0000 UTC m=+1366.294799591" Jun 13 05:13:28 crc kubenswrapper[4894]: I0613 05:13:28.285887 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" path="/var/lib/kubelet/pods/4eca2e73-8b2b-40d6-9e6d-a11c21e64e79/volumes" Jun 13 05:13:32 crc kubenswrapper[4894]: I0613 05:13:32.291746 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:32 crc kubenswrapper[4894]: I0613 05:13:32.292177 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:32 crc kubenswrapper[4894]: I0613 05:13:32.340919 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:32 crc kubenswrapper[4894]: I0613 05:13:32.914760 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:32 crc kubenswrapper[4894]: I0613 05:13:32.965054 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:34 crc kubenswrapper[4894]: I0613 05:13:34.885169 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p4wk9" 
podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="registry-server" containerID="cri-o://0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0" gracePeriod=2 Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.366042 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.471442 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27qsm\" (UniqueName: \"kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm\") pod \"93ea5fee-6345-4905-9e5e-76c00e22eae6\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.471580 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content\") pod \"93ea5fee-6345-4905-9e5e-76c00e22eae6\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.471608 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities\") pod \"93ea5fee-6345-4905-9e5e-76c00e22eae6\" (UID: \"93ea5fee-6345-4905-9e5e-76c00e22eae6\") " Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.473709 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities" (OuterVolumeSpecName: "utilities") pod "93ea5fee-6345-4905-9e5e-76c00e22eae6" (UID: "93ea5fee-6345-4905-9e5e-76c00e22eae6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.478312 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm" (OuterVolumeSpecName: "kube-api-access-27qsm") pod "93ea5fee-6345-4905-9e5e-76c00e22eae6" (UID: "93ea5fee-6345-4905-9e5e-76c00e22eae6"). InnerVolumeSpecName "kube-api-access-27qsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.521537 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ea5fee-6345-4905-9e5e-76c00e22eae6" (UID: "93ea5fee-6345-4905-9e5e-76c00e22eae6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.573909 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27qsm\" (UniqueName: \"kubernetes.io/projected/93ea5fee-6345-4905-9e5e-76c00e22eae6-kube-api-access-27qsm\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.573945 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.573958 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ea5fee-6345-4905-9e5e-76c00e22eae6-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.892933 4894 generic.go:334] "Generic (PLEG): container finished" podID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerID="0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0" exitCode=0 Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.893105 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerDied","Data":"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0"} Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.894072 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4wk9" event={"ID":"93ea5fee-6345-4905-9e5e-76c00e22eae6","Type":"ContainerDied","Data":"94a5900ee649e6c1b18c54a7aa1ca784038fe04cb7dce8e3c8ad79bd14ef97ba"} Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.893182 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p4wk9" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.894154 4894 scope.go:117] "RemoveContainer" containerID="0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.910542 4894 scope.go:117] "RemoveContainer" containerID="3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.948200 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.953582 4894 scope.go:117] "RemoveContainer" containerID="ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.957052 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p4wk9"] Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.978083 4894 scope.go:117] "RemoveContainer" containerID="0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0" Jun 13 05:13:35 crc kubenswrapper[4894]: E0613 05:13:35.978494 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0\": container with ID starting with 0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0 not found: ID does not exist" containerID="0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.978532 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0"} err="failed to get container status \"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0\": rpc error: code = NotFound desc = could not find container \"0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0\": container with ID starting with 0f95fb8e20e8e9b5c85e5addf64b1feb17b861fb80cc9ac5b81459bdfcb5bbb0 not found: ID does not exist" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.978554 4894 scope.go:117] "RemoveContainer" containerID="3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305" Jun 13 05:13:35 crc kubenswrapper[4894]: E0613 05:13:35.978891 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305\": container with ID starting with 3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305 not found: ID does not exist" containerID="3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.978911 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305"} err="failed to get container status \"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305\": rpc error: code = NotFound desc = could not find container \"3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305\": container with ID starting with 3716043952d7eb5257a204af6cced276e67d9efb33b908a3061333aa42170305 not found: ID does not exist" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.978925 4894 scope.go:117] "RemoveContainer" 
containerID="ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640" Jun 13 05:13:35 crc kubenswrapper[4894]: E0613 05:13:35.979208 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640\": container with ID starting with ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640 not found: ID does not exist" containerID="ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640" Jun 13 05:13:35 crc kubenswrapper[4894]: I0613 05:13:35.979226 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640"} err="failed to get container status \"ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640\": rpc error: code = NotFound desc = could not find container \"ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640\": container with ID starting with ff2c22e125e6d0dd24751abe32977553ee5054a9d31f90481b3e2ee321fee640 not found: ID does not exist" Jun 13 05:13:36 crc kubenswrapper[4894]: I0613 05:13:36.289795 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" path="/var/lib/kubelet/pods/93ea5fee-6345-4905-9e5e-76c00e22eae6/volumes" Jun 13 05:13:41 crc kubenswrapper[4894]: I0613 05:13:41.959090 4894 generic.go:334] "Generic (PLEG): container finished" podID="c05db1ce-0491-40b6-a148-be6b414542bc" containerID="7fec164af3701d73e78da5c63e2a0bd8b9db87d7d559a397fb0fa9fac82c7ca9" exitCode=0 Jun 13 05:13:41 crc kubenswrapper[4894]: I0613 05:13:41.959209 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" event={"ID":"c05db1ce-0491-40b6-a148-be6b414542bc","Type":"ContainerDied","Data":"7fec164af3701d73e78da5c63e2a0bd8b9db87d7d559a397fb0fa9fac82c7ca9"} Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.424274 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.539833 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key\") pod \"c05db1ce-0491-40b6-a148-be6b414542bc\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.540170 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory\") pod \"c05db1ce-0491-40b6-a148-be6b414542bc\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.540942 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svn5z\" (UniqueName: \"kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z\") pod \"c05db1ce-0491-40b6-a148-be6b414542bc\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.541321 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle\") pod \"c05db1ce-0491-40b6-a148-be6b414542bc\" (UID: \"c05db1ce-0491-40b6-a148-be6b414542bc\") " Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.554070 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z" (OuterVolumeSpecName: "kube-api-access-svn5z") pod "c05db1ce-0491-40b6-a148-be6b414542bc" (UID: "c05db1ce-0491-40b6-a148-be6b414542bc"). InnerVolumeSpecName "kube-api-access-svn5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.557891 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c05db1ce-0491-40b6-a148-be6b414542bc" (UID: "c05db1ce-0491-40b6-a148-be6b414542bc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.577330 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory" (OuterVolumeSpecName: "inventory") pod "c05db1ce-0491-40b6-a148-be6b414542bc" (UID: "c05db1ce-0491-40b6-a148-be6b414542bc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.588876 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c05db1ce-0491-40b6-a148-be6b414542bc" (UID: "c05db1ce-0491-40b6-a148-be6b414542bc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.644725 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svn5z\" (UniqueName: \"kubernetes.io/projected/c05db1ce-0491-40b6-a148-be6b414542bc-kube-api-access-svn5z\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.644753 4894 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.644763 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.644772 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c05db1ce-0491-40b6-a148-be6b414542bc-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.989836 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" event={"ID":"c05db1ce-0491-40b6-a148-be6b414542bc","Type":"ContainerDied","Data":"86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2"} Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.990174 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86dac125ba8fec94b7afd01237bd96ab2a78543d76c1da24a9ae4f22b2dd2aa2" Jun 13 05:13:43 crc kubenswrapper[4894]: I0613 05:13:43.989900 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110024 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2"] Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110594 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110649 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110719 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05db1ce-0491-40b6-a148-be6b414542bc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110734 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05db1ce-0491-40b6-a148-be6b414542bc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110752 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110764 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110788 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="extract-content" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110802 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="extract-content" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110823 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="extract-utilities" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110836 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="extract-utilities" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110860 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="extract-content" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110872 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="extract-content" Jun 13 05:13:44 crc kubenswrapper[4894]: E0613 05:13:44.110899 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="extract-utilities" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.110911 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="extract-utilities" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.111196 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ea5fee-6345-4905-9e5e-76c00e22eae6" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.111868 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05db1ce-0491-40b6-a148-be6b414542bc" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.111912 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eca2e73-8b2b-40d6-9e6d-a11c21e64e79" containerName="registry-server" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.112604 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.115595 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.116005 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.116255 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.121957 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.147776 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2"] Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.258060 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.258400 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nrss\" (UniqueName: \"kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.258461 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.360373 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.360816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nrss\" (UniqueName: \"kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.360870 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.367452 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.374593 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.392172 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nrss\" (UniqueName: \"kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:44 crc kubenswrapper[4894]: I0613 05:13:44.431312 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:13:45 crc kubenswrapper[4894]: I0613 05:13:45.025627 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2"] Jun 13 05:13:45 crc kubenswrapper[4894]: I0613 05:13:45.038457 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:13:46 crc kubenswrapper[4894]: I0613 05:13:46.012715 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" event={"ID":"d382942f-e4cf-4fa0-9331-f8dbf464cd2e","Type":"ContainerStarted","Data":"65ef03f03c1f7d50827d9379d9a0df505c65cd4c054cf948d7c3628fb443f688"} Jun 13 05:13:46 crc kubenswrapper[4894]: I0613 05:13:46.013322 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" event={"ID":"d382942f-e4cf-4fa0-9331-f8dbf464cd2e","Type":"ContainerStarted","Data":"a63109d1236056af7a40693a6040f156bdd513ae0cf74864a9d6c124ea80c5bf"} Jun 13 05:13:46 crc kubenswrapper[4894]: I0613 05:13:46.049411 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" podStartSLOduration=1.607435788 podStartE2EDuration="2.049387082s" podCreationTimestamp="2025-06-13 05:13:44 +0000 UTC" firstStartedPulling="2025-06-13 05:13:45.037902982 +0000 UTC m=+1383.484150485" lastFinishedPulling="2025-06-13 05:13:45.479854276 +0000 UTC m=+1383.926101779" observedRunningTime="2025-06-13 05:13:46.034073168 +0000 UTC m=+1384.480320641" watchObservedRunningTime="2025-06-13 05:13:46.049387082 +0000 UTC m=+1384.495634555" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.349837 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-92lzk"] Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.351719 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.353697 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.437204 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.437288 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqd26\" (UniqueName: \"kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.539231 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.539583 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqd26\" (UniqueName: \"kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.539413 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.565951 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqd26\" (UniqueName: \"kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26\") pod \"crc-debug-92lzk\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: I0613 05:14:02.670020 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-92lzk" Jun 13 05:14:02 crc kubenswrapper[4894]: W0613 05:14:02.725992 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e7f6a04_ac58_489d_be49_8ec8cd104f7b.slice/crio-ada00e9e28e1fc74f8d211f1a97ab164c9f8ad44e200a018b0e35978cc73088e WatchSource:0}: Error finding container ada00e9e28e1fc74f8d211f1a97ab164c9f8ad44e200a018b0e35978cc73088e: Status 404 returned error can't find the container with id ada00e9e28e1fc74f8d211f1a97ab164c9f8ad44e200a018b0e35978cc73088e Jun 13 05:14:03 crc kubenswrapper[4894]: I0613 05:14:03.205500 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-92lzk" event={"ID":"9e7f6a04-ac58-489d-be49-8ec8cd104f7b","Type":"ContainerStarted","Data":"a658d0561a64ec217cfeaa54a17405c902f0268d0ff76d1a74b146459df0f976"} Jun 13 05:14:03 crc kubenswrapper[4894]: I0613 05:14:03.205982 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-92lzk" event={"ID":"9e7f6a04-ac58-489d-be49-8ec8cd104f7b","Type":"ContainerStarted","Data":"ada00e9e28e1fc74f8d211f1a97ab164c9f8ad44e200a018b0e35978cc73088e"} Jun 13 05:14:03 crc kubenswrapper[4894]: I0613 05:14:03.226973 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-92lzk" podStartSLOduration=1.2269566219999999 podStartE2EDuration="1.226956622s" podCreationTimestamp="2025-06-13 05:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:14:03.224218425 +0000 UTC m=+1401.670465918" watchObservedRunningTime="2025-06-13 05:14:03.226956622 +0000 UTC m=+1401.673204085" Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.312369 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-92lzk"] Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.313185 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-92lzk" podUID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" containerName="container-00" containerID="cri-o://a658d0561a64ec217cfeaa54a17405c902f0268d0ff76d1a74b146459df0f976" gracePeriod=2 Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.324549 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-92lzk"] Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.336309 4894 generic.go:334] "Generic (PLEG): container finished" podID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" containerID="a658d0561a64ec217cfeaa54a17405c902f0268d0ff76d1a74b146459df0f976" exitCode=0 Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.400924 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-92lzk" Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.507287 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host\") pod \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.507376 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host" (OuterVolumeSpecName: "host") pod "9e7f6a04-ac58-489d-be49-8ec8cd104f7b" (UID: "9e7f6a04-ac58-489d-be49-8ec8cd104f7b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.507397 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqd26\" (UniqueName: \"kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26\") pod \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\" (UID: \"9e7f6a04-ac58-489d-be49-8ec8cd104f7b\") " Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.508275 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.512597 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26" (OuterVolumeSpecName: "kube-api-access-lqd26") pod "9e7f6a04-ac58-489d-be49-8ec8cd104f7b" (UID: "9e7f6a04-ac58-489d-be49-8ec8cd104f7b"). InnerVolumeSpecName "kube-api-access-lqd26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:14:13 crc kubenswrapper[4894]: I0613 05:14:13.609499 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqd26\" (UniqueName: \"kubernetes.io/projected/9e7f6a04-ac58-489d-be49-8ec8cd104f7b-kube-api-access-lqd26\") on node \"crc\" DevicePath \"\"" Jun 13 05:14:14 crc kubenswrapper[4894]: I0613 05:14:14.298553 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" path="/var/lib/kubelet/pods/9e7f6a04-ac58-489d-be49-8ec8cd104f7b/volumes" Jun 13 05:14:14 crc kubenswrapper[4894]: I0613 05:14:14.356816 4894 scope.go:117] "RemoveContainer" containerID="a658d0561a64ec217cfeaa54a17405c902f0268d0ff76d1a74b146459df0f976" Jun 13 05:14:14 crc kubenswrapper[4894]: I0613 05:14:14.356888 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-92lzk" Jun 13 05:14:26 crc kubenswrapper[4894]: I0613 05:14:26.236559 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:14:26 crc kubenswrapper[4894]: I0613 05:14:26.237265 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:14:31 crc kubenswrapper[4894]: I0613 05:14:31.035819 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wnhfr"] Jun 13 05:14:31 crc kubenswrapper[4894]: I0613 05:14:31.041411 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wnhfr"] Jun 13 05:14:32 crc kubenswrapper[4894]: I0613 05:14:32.292184 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7575b610-4439-4ce2-bcc0-1d52e1ab719f" path="/var/lib/kubelet/pods/7575b610-4439-4ce2-bcc0-1d52e1ab719f/volumes" Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.071440 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2trmz"] Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.083384 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wtp2n"] Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.093021 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2trmz"] Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.102042 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wtp2n"] Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.291269 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db94294-9551-4d80-8c50-5ac61b3343bf" path="/var/lib/kubelet/pods/1db94294-9551-4d80-8c50-5ac61b3343bf/volumes" Jun 13 05:14:40 crc kubenswrapper[4894]: I0613 05:14:40.292232 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4" path="/var/lib/kubelet/pods/565bfe30-bbf9-4eb4-b308-c2bcff8ecbf4/volumes" Jun 13 05:14:41 crc kubenswrapper[4894]: I0613 05:14:41.033248 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0696-account-create-28h8v"] Jun 13 05:14:41 crc kubenswrapper[4894]: I0613 05:14:41.042171 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0696-account-create-28h8v"] Jun 13 05:14:42 crc kubenswrapper[4894]: I0613 05:14:42.291382 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee825978-f532-48e8-aeca-7f6a21fd1625" path="/var/lib/kubelet/pods/ee825978-f532-48e8-aeca-7f6a21fd1625/volumes" Jun 13 05:14:56 crc kubenswrapper[4894]: I0613 05:14:56.235936 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:14:56 crc kubenswrapper[4894]: I0613 05:14:56.236526 4894 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:14:57 crc kubenswrapper[4894]: I0613 05:14:57.039957 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c1f1-account-create-xrwwd"] Jun 13 05:14:57 crc kubenswrapper[4894]: I0613 05:14:57.049804 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d6e4-account-create-9tvc8"] Jun 13 05:14:57 crc kubenswrapper[4894]: I0613 05:14:57.062365 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c1f1-account-create-xrwwd"] Jun 13 05:14:57 crc kubenswrapper[4894]: I0613 05:14:57.070370 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d6e4-account-create-9tvc8"] Jun 13 05:14:58 crc kubenswrapper[4894]: I0613 05:14:58.296090 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aba867f-c501-4477-bbc6-9d713f2d2b13" path="/var/lib/kubelet/pods/5aba867f-c501-4477-bbc6-9d713f2d2b13/volumes" Jun 13 05:14:58 crc kubenswrapper[4894]: I0613 05:14:58.297199 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb05c85b-3440-4c78-b64a-9e950da85ed9" path="/var/lib/kubelet/pods/cb05c85b-3440-4c78-b64a-9e950da85ed9/volumes" Jun 13 05:14:59 crc kubenswrapper[4894]: I0613 05:14:59.840780 4894 generic.go:334] "Generic (PLEG): container finished" podID="d382942f-e4cf-4fa0-9331-f8dbf464cd2e" containerID="65ef03f03c1f7d50827d9379d9a0df505c65cd4c054cf948d7c3628fb443f688" exitCode=0 Jun 13 05:14:59 crc kubenswrapper[4894]: I0613 05:14:59.840846 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" event={"ID":"d382942f-e4cf-4fa0-9331-f8dbf464cd2e","Type":"ContainerDied","Data":"65ef03f03c1f7d50827d9379d9a0df505c65cd4c054cf948d7c3628fb443f688"} Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.055966 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-k5mrc"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.066113 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4c6w7"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.076198 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-skf6h"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.085401 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4c6w7"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.091237 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-skf6h"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.096582 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-k5mrc"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.180071 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m"] Jun 13 05:15:00 crc kubenswrapper[4894]: E0613 05:15:00.180477 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" containerName="container-00" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.180490 4894 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" containerName="container-00" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.180689 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7f6a04-ac58-489d-be49-8ec8cd104f7b" containerName="container-00" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.181311 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.187558 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.187972 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.192534 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m"] Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.247016 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.247090 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76bk\" (UniqueName: \"kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.247396 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.288775 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375c390c-32f8-4a62-84df-ce789ec5a118" path="/var/lib/kubelet/pods/375c390c-32f8-4a62-84df-ce789ec5a118/volumes" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.289432 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48af4afd-782d-44df-a045-53a21dc75744" path="/var/lib/kubelet/pods/48af4afd-782d-44df-a045-53a21dc75744/volumes" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.290073 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab77546b-abcf-47cc-88ab-1d11d45d837d" path="/var/lib/kubelet/pods/ab77546b-abcf-47cc-88ab-1d11d45d837d/volumes" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.350037 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.350172 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b76bk\" (UniqueName: \"kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.350318 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.351707 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.360523 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.382737 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76bk\" (UniqueName: \"kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk\") pod \"collect-profiles-29163195-zgm7m\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:00 crc kubenswrapper[4894]: I0613 05:15:00.514463 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.016212 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m"] Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.172748 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.267441 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key\") pod \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.267550 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nrss\" (UniqueName: \"kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss\") pod \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.267647 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory\") pod \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\" (UID: \"d382942f-e4cf-4fa0-9331-f8dbf464cd2e\") " Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.274063 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss" (OuterVolumeSpecName: "kube-api-access-9nrss") pod "d382942f-e4cf-4fa0-9331-f8dbf464cd2e" (UID: "d382942f-e4cf-4fa0-9331-f8dbf464cd2e"). InnerVolumeSpecName "kube-api-access-9nrss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.293379 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d382942f-e4cf-4fa0-9331-f8dbf464cd2e" (UID: "d382942f-e4cf-4fa0-9331-f8dbf464cd2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.294317 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory" (OuterVolumeSpecName: "inventory") pod "d382942f-e4cf-4fa0-9331-f8dbf464cd2e" (UID: "d382942f-e4cf-4fa0-9331-f8dbf464cd2e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.372133 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.372183 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nrss\" (UniqueName: \"kubernetes.io/projected/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-kube-api-access-9nrss\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.372204 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d382942f-e4cf-4fa0-9331-f8dbf464cd2e-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.795472 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-z82qd"] Jun 13 05:15:01 crc kubenswrapper[4894]: E0613 05:15:01.796113 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d382942f-e4cf-4fa0-9331-f8dbf464cd2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.796131 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d382942f-e4cf-4fa0-9331-f8dbf464cd2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.796308 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d382942f-e4cf-4fa0-9331-f8dbf464cd2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.796906 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.798940 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.859045 4894 generic.go:334] "Generic (PLEG): container finished" podID="0b28a50d-2d30-4225-aa88-54fbdaf4a48a" containerID="ce660fb0cb608d8b8ded9837fa75c9c01491abf5532441da05fce07c7b0fdb89" exitCode=0 Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.859137 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" event={"ID":"0b28a50d-2d30-4225-aa88-54fbdaf4a48a","Type":"ContainerDied","Data":"ce660fb0cb608d8b8ded9837fa75c9c01491abf5532441da05fce07c7b0fdb89"} Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.859192 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" event={"ID":"0b28a50d-2d30-4225-aa88-54fbdaf4a48a","Type":"ContainerStarted","Data":"d770545dbe23f8b1e39b4f741365b9a225ffea759d1ad9e0e075b14e0636d05b"} Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.861211 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" event={"ID":"d382942f-e4cf-4fa0-9331-f8dbf464cd2e","Type":"ContainerDied","Data":"a63109d1236056af7a40693a6040f156bdd513ae0cf74864a9d6c124ea80c5bf"} Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.861261 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63109d1236056af7a40693a6040f156bdd513ae0cf74864a9d6c124ea80c5bf" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.861343 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.879080 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.879270 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7qb\" (UniqueName: \"kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.943429 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw"] Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.944840 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.949404 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.949838 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.950800 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.951515 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.958696 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw"] Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981151 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981200 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkq6p\" (UniqueName: \"kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981236 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981284 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7qb\" (UniqueName: \"kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.981366 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:01 crc kubenswrapper[4894]: I0613 05:15:01.998115 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gn7qb\" (UniqueName: \"kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb\") pod \"crc-debug-z82qd\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " pod="openstack/crc-debug-z82qd" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.083039 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.083103 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkq6p\" (UniqueName: \"kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.083190 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.086623 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.087192 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.110452 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-z82qd" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.116618 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkq6p\" (UniqueName: \"kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: W0613 05:15:02.143741 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb2353c_35eb_4ed1_99d9_2939bb165228.slice/crio-8f6fcc79898091efd5db8e0c880739e71d3f60c71797cfd6c4a2cea5004215c9 WatchSource:0}: Error finding container 8f6fcc79898091efd5db8e0c880739e71d3f60c71797cfd6c4a2cea5004215c9: Status 404 returned error can't find the container with id 8f6fcc79898091efd5db8e0c880739e71d3f60c71797cfd6c4a2cea5004215c9 Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.261316 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.781544 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw"] Jun 13 05:15:02 crc kubenswrapper[4894]: W0613 05:15:02.791328 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9451abf4_721b_4ea5_b14e_5ac8f9beb4f5.slice/crio-b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b WatchSource:0}: Error finding container b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b: Status 404 returned error can't find the container with id b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.871388 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-z82qd" event={"ID":"0fb2353c-35eb-4ed1-99d9-2939bb165228","Type":"ContainerStarted","Data":"b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc"} Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.871427 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-z82qd" event={"ID":"0fb2353c-35eb-4ed1-99d9-2939bb165228","Type":"ContainerStarted","Data":"8f6fcc79898091efd5db8e0c880739e71d3f60c71797cfd6c4a2cea5004215c9"} Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.872773 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" event={"ID":"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5","Type":"ContainerStarted","Data":"b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b"} Jun 13 05:15:02 crc kubenswrapper[4894]: I0613 05:15:02.891976 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-z82qd" podStartSLOduration=1.891953774 podStartE2EDuration="1.891953774s" podCreationTimestamp="2025-06-13 05:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:15:02.889984468 +0000 UTC m=+1461.336231941" watchObservedRunningTime="2025-06-13 05:15:02.891953774 +0000 UTC m=+1461.338201247" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 
05:15:03.212214 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.309622 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume\") pod \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.309752 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b76bk\" (UniqueName: \"kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk\") pod \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.309828 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume\") pod \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\" (UID: \"0b28a50d-2d30-4225-aa88-54fbdaf4a48a\") " Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.310608 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b28a50d-2d30-4225-aa88-54fbdaf4a48a" (UID: "0b28a50d-2d30-4225-aa88-54fbdaf4a48a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.313809 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b28a50d-2d30-4225-aa88-54fbdaf4a48a" (UID: "0b28a50d-2d30-4225-aa88-54fbdaf4a48a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.314291 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk" (OuterVolumeSpecName: "kube-api-access-b76bk") pod "0b28a50d-2d30-4225-aa88-54fbdaf4a48a" (UID: "0b28a50d-2d30-4225-aa88-54fbdaf4a48a"). InnerVolumeSpecName "kube-api-access-b76bk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.411437 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.411736 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b76bk\" (UniqueName: \"kubernetes.io/projected/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-kube-api-access-b76bk\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.411823 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28a50d-2d30-4225-aa88-54fbdaf4a48a-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.895495 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" event={"ID":"0b28a50d-2d30-4225-aa88-54fbdaf4a48a","Type":"ContainerDied","Data":"d770545dbe23f8b1e39b4f741365b9a225ffea759d1ad9e0e075b14e0636d05b"} Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.895804 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d770545dbe23f8b1e39b4f741365b9a225ffea759d1ad9e0e075b14e0636d05b" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.895897 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m" Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.910025 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" event={"ID":"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5","Type":"ContainerStarted","Data":"06f374120fec460d44b20f490960f5eabf2185611be8e43b58038016b3113335"} Jun 13 05:15:03 crc kubenswrapper[4894]: I0613 05:15:03.929845 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" podStartSLOduration=2.485460068 podStartE2EDuration="2.9298273s" podCreationTimestamp="2025-06-13 05:15:01 +0000 UTC" firstStartedPulling="2025-06-13 05:15:02.794575437 +0000 UTC m=+1461.240822920" lastFinishedPulling="2025-06-13 05:15:03.238942689 +0000 UTC m=+1461.685190152" observedRunningTime="2025-06-13 05:15:03.92490565 +0000 UTC m=+1462.371153113" watchObservedRunningTime="2025-06-13 05:15:03.9298273 +0000 UTC m=+1462.376074763" Jun 13 05:15:05 crc kubenswrapper[4894]: I0613 05:15:05.051835 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-t6wgt"] Jun 13 05:15:05 crc kubenswrapper[4894]: I0613 05:15:05.060953 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-t6wgt"] Jun 13 05:15:06 crc kubenswrapper[4894]: I0613 05:15:06.292226 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9908123d-bc70-4017-953a-8f0a082f2726" path="/var/lib/kubelet/pods/9908123d-bc70-4017-953a-8f0a082f2726/volumes" Jun 13 05:15:07 crc kubenswrapper[4894]: I0613 05:15:07.038257 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-843e-account-create-cr568"] Jun 13 05:15:07 crc kubenswrapper[4894]: I0613 05:15:07.052557 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-843e-account-create-cr568"] Jun 13 
05:15:08 crc kubenswrapper[4894]: I0613 05:15:08.296929 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519e8c61-c11c-42c9-bb32-c4a454724fe1" path="/var/lib/kubelet/pods/519e8c61-c11c-42c9-bb32-c4a454724fe1/volumes" Jun 13 05:15:08 crc kubenswrapper[4894]: I0613 05:15:08.973594 4894 generic.go:334] "Generic (PLEG): container finished" podID="9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" containerID="06f374120fec460d44b20f490960f5eabf2185611be8e43b58038016b3113335" exitCode=0 Jun 13 05:15:08 crc kubenswrapper[4894]: I0613 05:15:08.973698 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" event={"ID":"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5","Type":"ContainerDied","Data":"06f374120fec460d44b20f490960f5eabf2185611be8e43b58038016b3113335"} Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.411980 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.462542 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkq6p\" (UniqueName: \"kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p\") pod \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.462814 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory\") pod \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.462846 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key\") pod \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\" (UID: \"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5\") " Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.475885 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p" (OuterVolumeSpecName: "kube-api-access-dkq6p") pod "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" (UID: "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5"). InnerVolumeSpecName "kube-api-access-dkq6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.489513 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" (UID: "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.489704 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory" (OuterVolumeSpecName: "inventory") pod "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" (UID: "9451abf4-721b-4ea5-b14e-5ac8f9beb4f5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.564363 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.564388 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.564397 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkq6p\" (UniqueName: \"kubernetes.io/projected/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5-kube-api-access-dkq6p\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.914047 4894 scope.go:117] "RemoveContainer" containerID="f437271ec3faf6391096e86126d0e358c6b2fbd479ff68f74e82797dcec85c12" Jun 13 05:15:10 crc kubenswrapper[4894]: I0613 05:15:10.965789 4894 scope.go:117] "RemoveContainer" containerID="7d7da288532c226ecceb30f339e777c0228c2294ea36b0db72aabe58eb4a1d49" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.056484 4894 scope.go:117] "RemoveContainer" containerID="e0a1e8d9d2c43aedc8bee2c1ec1db47dec62fe0d6c3acc4d1373ed8fa1991371" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.067314 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" event={"ID":"9451abf4-721b-4ea5-b14e-5ac8f9beb4f5","Type":"ContainerDied","Data":"b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b"} Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.067352 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0dc4364e491bd02f9c433644a6dba4caf1224ae452338dd17454a77efce4f1b" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.067406 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.092402 4894 scope.go:117] "RemoveContainer" containerID="1095cfa46f020d5a1967a1785095cfe5ef6f01d81633d339e2af4faf0ac03a91" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.096577 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7"] Jun 13 05:15:11 crc kubenswrapper[4894]: E0613 05:15:11.097433 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.097459 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:11 crc kubenswrapper[4894]: E0613 05:15:11.097491 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b28a50d-2d30-4225-aa88-54fbdaf4a48a" containerName="collect-profiles" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.097500 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b28a50d-2d30-4225-aa88-54fbdaf4a48a" containerName="collect-profiles" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.097745 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b28a50d-2d30-4225-aa88-54fbdaf4a48a" containerName="collect-profiles" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.097758 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.098587 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.101384 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.101755 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.103244 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.106937 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.106995 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7"] Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.148148 4894 scope.go:117] "RemoveContainer" containerID="2d1d3dd87ffd559c91fe9fc1c601a987a3d5aa7dc866b76ad24953701b8af19c" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.168084 4894 scope.go:117] "RemoveContainer" containerID="1d26a1188086b7a5561b6fcf3240baf9b9cb54329fc1d67c6aca6cbbc409fa38" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.178612 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.178696 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvhfw\" (UniqueName: \"kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.178778 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.194275 4894 scope.go:117] "RemoveContainer" containerID="fca5074640aab5c4d09b03510b18cb4e62c9d0f174cd847b9600ce890d87eaaa" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.212258 4894 scope.go:117] "RemoveContainer" containerID="902880520ce546f18d9dc06e3f3752e078fb2863a89a1462ee1ca5549ca5d14a" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.227769 4894 scope.go:117] "RemoveContainer" containerID="ede9096899e17d4a008948fceb45c19061180e59d18826993e4f8b1771c4f7ec" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.256638 4894 scope.go:117] "RemoveContainer" containerID="06d16fa5d1dd95ed64897321891cc340ad55efc6b8917d268cb45a36b8814bb5" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.280302 4894 scope.go:117] "RemoveContainer" 
containerID="d0166d765c558ebdb03a2f6737fcb71014fddf7d2b733b9534f80e857a0b9130" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.281089 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.281139 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvhfw\" (UniqueName: \"kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.281198 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.286423 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.286440 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.298917 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvhfw\" (UniqueName: \"kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7kzf7\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.428782 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:15:11 crc kubenswrapper[4894]: I0613 05:15:11.937319 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7"] Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.082038 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" event={"ID":"85491710-8923-482a-bd20-6e82b284a439","Type":"ContainerStarted","Data":"91c5f5f47c99dd1b71d8443085e1b76f5a7a40a8d57c983f2bd4cb346e3cc2ca"} Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.626338 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-z82qd"] Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.629543 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-z82qd" podUID="0fb2353c-35eb-4ed1-99d9-2939bb165228" containerName="container-00" containerID="cri-o://b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc" gracePeriod=2 Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.639375 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-z82qd"] Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.711627 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-z82qd" Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.811141 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host\") pod \"0fb2353c-35eb-4ed1-99d9-2939bb165228\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.811268 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host" (OuterVolumeSpecName: "host") pod "0fb2353c-35eb-4ed1-99d9-2939bb165228" (UID: "0fb2353c-35eb-4ed1-99d9-2939bb165228"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.811368 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7qb\" (UniqueName: \"kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb\") pod \"0fb2353c-35eb-4ed1-99d9-2939bb165228\" (UID: \"0fb2353c-35eb-4ed1-99d9-2939bb165228\") " Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.811806 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb2353c-35eb-4ed1-99d9-2939bb165228-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.815524 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb" (OuterVolumeSpecName: "kube-api-access-gn7qb") pod "0fb2353c-35eb-4ed1-99d9-2939bb165228" (UID: "0fb2353c-35eb-4ed1-99d9-2939bb165228"). InnerVolumeSpecName "kube-api-access-gn7qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:15:12 crc kubenswrapper[4894]: I0613 05:15:12.913553 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7qb\" (UniqueName: \"kubernetes.io/projected/0fb2353c-35eb-4ed1-99d9-2939bb165228-kube-api-access-gn7qb\") on node \"crc\" DevicePath \"\"" Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.096629 4894 generic.go:334] "Generic (PLEG): container finished" podID="0fb2353c-35eb-4ed1-99d9-2939bb165228" containerID="b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc" exitCode=0 Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.097290 4894 scope.go:117] "RemoveContainer" containerID="b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc" Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.097869 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-z82qd" Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.107324 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" event={"ID":"85491710-8923-482a-bd20-6e82b284a439","Type":"ContainerStarted","Data":"1b0f2629782e9f14cb6e94ea221258f07df6750fe6590a262d8105761ba4a41f"} Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.140318 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" podStartSLOduration=1.68762501 podStartE2EDuration="2.140297298s" podCreationTimestamp="2025-06-13 05:15:11 +0000 UTC" firstStartedPulling="2025-06-13 05:15:11.955964344 +0000 UTC m=+1470.402211827" lastFinishedPulling="2025-06-13 05:15:12.408636662 +0000 UTC m=+1470.854884115" observedRunningTime="2025-06-13 05:15:13.130315026 +0000 UTC m=+1471.576562499" watchObservedRunningTime="2025-06-13 05:15:13.140297298 +0000 UTC m=+1471.586544771" Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.147133 4894 scope.go:117] "RemoveContainer" containerID="b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc" Jun 13 05:15:13 crc kubenswrapper[4894]: E0613 05:15:13.147729 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc\": container with ID starting with b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc not found: ID does not exist" containerID="b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc" Jun 13 05:15:13 crc kubenswrapper[4894]: I0613 05:15:13.147775 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc"} err="failed to get container status \"b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc\": rpc error: code = NotFound desc = could not find container \"b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc\": container with ID starting with b072215888cd6b604452cdb788c319f46248e78d02921d071517f420334acacc not found: ID does not exist" Jun 13 05:15:14 crc kubenswrapper[4894]: I0613 05:15:14.296258 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb2353c-35eb-4ed1-99d9-2939bb165228" path="/var/lib/kubelet/pods/0fb2353c-35eb-4ed1-99d9-2939bb165228/volumes" Jun 13 05:15:17 crc kubenswrapper[4894]: I0613 05:15:17.053328 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-e141-account-create-sgw88"] Jun 13 05:15:17 crc kubenswrapper[4894]: I0613 05:15:17.066855 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8adb-account-create-sxgkf"] Jun 13 05:15:17 crc kubenswrapper[4894]: I0613 05:15:17.077208 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e141-account-create-sgw88"] Jun 13 05:15:17 crc kubenswrapper[4894]: I0613 05:15:17.091962 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8adb-account-create-sxgkf"] Jun 13 05:15:18 crc kubenswrapper[4894]: I0613 05:15:18.043421 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wzjkd"] Jun 13 05:15:18 crc kubenswrapper[4894]: I0613 05:15:18.055452 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wzjkd"] Jun 13 05:15:18 crc kubenswrapper[4894]: I0613 05:15:18.291952 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff06e15-dbe8-4864-a039-e30cb2cd88d5" path="/var/lib/kubelet/pods/3ff06e15-dbe8-4864-a039-e30cb2cd88d5/volumes" Jun 13 05:15:18 crc kubenswrapper[4894]: I0613 05:15:18.292631 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d173fc-d458-4f68-b2ea-b6b2ed942c5d" path="/var/lib/kubelet/pods/82d173fc-d458-4f68-b2ea-b6b2ed942c5d/volumes" Jun 13 05:15:18 crc kubenswrapper[4894]: I0613 05:15:18.293370 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb658e7d-f920-4362-9ae8-149aaca08cda" path="/var/lib/kubelet/pods/cb658e7d-f920-4362-9ae8-149aaca08cda/volumes" Jun 13 05:15:26 crc kubenswrapper[4894]: I0613 05:15:26.236516 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:15:26 crc kubenswrapper[4894]: I0613 05:15:26.237045 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:15:26 crc kubenswrapper[4894]: I0613 05:15:26.237092 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:15:26 crc kubenswrapper[4894]: I0613 05:15:26.237862 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:15:26 crc kubenswrapper[4894]: I0613 05:15:26.237930 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" gracePeriod=600 Jun 13 05:15:26 crc kubenswrapper[4894]: E0613 05:15:26.365204 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:15:27 crc kubenswrapper[4894]: I0613 05:15:27.280761 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" exitCode=0 Jun 13 05:15:27 crc kubenswrapper[4894]: I0613 05:15:27.280799 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b"} Jun 13 05:15:27 crc kubenswrapper[4894]: I0613 05:15:27.280825 4894 scope.go:117] "RemoveContainer" containerID="ae14d03c47cb9a934d92643fd49aa901579592e5def7191953663603eb9bafdf" Jun 13 05:15:27 crc kubenswrapper[4894]: I0613 05:15:27.281161 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:15:27 crc kubenswrapper[4894]: E0613 05:15:27.281350 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:15:39 crc kubenswrapper[4894]: I0613 05:15:39.276930 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:15:39 crc kubenswrapper[4894]: E0613 05:15:39.277994 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:15:41 crc kubenswrapper[4894]: I0613 05:15:41.056119 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7xc7g"] Jun 13 05:15:41 crc kubenswrapper[4894]: I0613 05:15:41.062293 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7xc7g"] Jun 13 05:15:42 crc kubenswrapper[4894]: I0613 05:15:42.293168 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab9044c-7402-497f-9496-c6ed4aaaa76c" path="/var/lib/kubelet/pods/7ab9044c-7402-497f-9496-c6ed4aaaa76c/volumes" Jun 13 05:15:46 crc kubenswrapper[4894]: I0613 05:15:46.042114 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cgk85"] Jun 13 05:15:46 crc kubenswrapper[4894]: I0613 05:15:46.058759 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cgk85"] Jun 13 05:15:46 crc kubenswrapper[4894]: I0613 05:15:46.291323 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bcc604a-93b6-4aca-bbca-0b078378889d" 
path="/var/lib/kubelet/pods/6bcc604a-93b6-4aca-bbca-0b078378889d/volumes" Jun 13 05:15:47 crc kubenswrapper[4894]: I0613 05:15:47.041592 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qhq8x"] Jun 13 05:15:47 crc kubenswrapper[4894]: I0613 05:15:47.049296 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qhq8x"] Jun 13 05:15:48 crc kubenswrapper[4894]: I0613 05:15:48.294300 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a28689a-17c0-44ae-b07d-4b23fd1ce70a" path="/var/lib/kubelet/pods/1a28689a-17c0-44ae-b07d-4b23fd1ce70a/volumes" Jun 13 05:15:53 crc kubenswrapper[4894]: I0613 05:15:53.276622 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:15:53 crc kubenswrapper[4894]: E0613 05:15:53.277583 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:15:59 crc kubenswrapper[4894]: I0613 05:15:59.040796 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gq5rk"] Jun 13 05:15:59 crc kubenswrapper[4894]: I0613 05:15:59.050872 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gq5rk"] Jun 13 05:16:00 crc kubenswrapper[4894]: I0613 05:16:00.292136 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcfc55e9-b62b-4d38-8e72-4cf04ba09524" path="/var/lib/kubelet/pods/fcfc55e9-b62b-4d38-8e72-4cf04ba09524/volumes" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.070458 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-h8jx4"] Jun 13 05:16:02 crc kubenswrapper[4894]: E0613 05:16:02.073616 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb2353c-35eb-4ed1-99d9-2939bb165228" containerName="container-00" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.073896 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb2353c-35eb-4ed1-99d9-2939bb165228" containerName="container-00" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.074505 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb2353c-35eb-4ed1-99d9-2939bb165228" containerName="container-00" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.075625 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.079210 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.193140 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.193992 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lrg\" (UniqueName: \"kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.297042 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lrg\" (UniqueName: \"kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.297149 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.297336 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.316992 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lrg\" (UniqueName: \"kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg\") pod \"crc-debug-h8jx4\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.398709 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-h8jx4" Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.683588 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-h8jx4" event={"ID":"7404170e-9775-40ee-8144-f3214cb9257a","Type":"ContainerStarted","Data":"ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131"} Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.684048 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-h8jx4" event={"ID":"7404170e-9775-40ee-8144-f3214cb9257a","Type":"ContainerStarted","Data":"4f67d797f3ae7e919224f3d0bef3f2b0c4b79325073abd29279e026170d81247"} Jun 13 05:16:02 crc kubenswrapper[4894]: I0613 05:16:02.704792 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-h8jx4" podStartSLOduration=0.70477581 podStartE2EDuration="704.77581ms" podCreationTimestamp="2025-06-13 05:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:16:02.701731023 +0000 UTC m=+1521.147978516" watchObservedRunningTime="2025-06-13 05:16:02.70477581 +0000 UTC m=+1521.151023283" Jun 13 05:16:05 crc kubenswrapper[4894]: I0613 05:16:05.276759 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:16:05 crc kubenswrapper[4894]: E0613 05:16:05.277622 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:16:06 crc kubenswrapper[4894]: I0613 05:16:06.728194 4894 generic.go:334] "Generic (PLEG): container finished" podID="85491710-8923-482a-bd20-6e82b284a439" containerID="1b0f2629782e9f14cb6e94ea221258f07df6750fe6590a262d8105761ba4a41f" exitCode=0 Jun 13 05:16:06 crc kubenswrapper[4894]: I0613 05:16:06.728232 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" event={"ID":"85491710-8923-482a-bd20-6e82b284a439","Type":"ContainerDied","Data":"1b0f2629782e9f14cb6e94ea221258f07df6750fe6590a262d8105761ba4a41f"} Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.196351 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.290012 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key\") pod \"85491710-8923-482a-bd20-6e82b284a439\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.290358 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvhfw\" (UniqueName: \"kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw\") pod \"85491710-8923-482a-bd20-6e82b284a439\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.290494 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory\") pod \"85491710-8923-482a-bd20-6e82b284a439\" (UID: \"85491710-8923-482a-bd20-6e82b284a439\") " Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.302607 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw" (OuterVolumeSpecName: "kube-api-access-wvhfw") pod "85491710-8923-482a-bd20-6e82b284a439" (UID: "85491710-8923-482a-bd20-6e82b284a439"). InnerVolumeSpecName "kube-api-access-wvhfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.335085 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory" (OuterVolumeSpecName: "inventory") pod "85491710-8923-482a-bd20-6e82b284a439" (UID: "85491710-8923-482a-bd20-6e82b284a439"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.336399 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85491710-8923-482a-bd20-6e82b284a439" (UID: "85491710-8923-482a-bd20-6e82b284a439"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.393829 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.394014 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvhfw\" (UniqueName: \"kubernetes.io/projected/85491710-8923-482a-bd20-6e82b284a439-kube-api-access-wvhfw\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.394200 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85491710-8923-482a-bd20-6e82b284a439-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.752812 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" event={"ID":"85491710-8923-482a-bd20-6e82b284a439","Type":"ContainerDied","Data":"91c5f5f47c99dd1b71d8443085e1b76f5a7a40a8d57c983f2bd4cb346e3cc2ca"} Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.752858 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c5f5f47c99dd1b71d8443085e1b76f5a7a40a8d57c983f2bd4cb346e3cc2ca" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.752931 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.849306 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn"] Jun 13 05:16:08 crc kubenswrapper[4894]: E0613 05:16:08.849736 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85491710-8923-482a-bd20-6e82b284a439" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.849756 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="85491710-8923-482a-bd20-6e82b284a439" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.849938 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="85491710-8923-482a-bd20-6e82b284a439" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.850512 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.893212 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.894017 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.894289 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.896236 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.901569 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.901743 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.901788 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtcsk\" (UniqueName: \"kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:08 crc kubenswrapper[4894]: I0613 05:16:08.919811 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn"] Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.003185 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtcsk\" (UniqueName: \"kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.003253 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.003389 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" 
(UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.007981 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.009506 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.031081 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtcsk\" (UniqueName: \"kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.047805 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-czlhd"] Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.064800 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-czlhd"] Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.213211 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.580395 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn"] Jun 13 05:16:09 crc kubenswrapper[4894]: I0613 05:16:09.760935 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" event={"ID":"7b0327b0-6896-425c-9e62-d179c465ff04","Type":"ContainerStarted","Data":"12ea1d59e713518b93242648bc603bf5633065a0b76a8e5c2dae7c700187fe05"} Jun 13 05:16:10 crc kubenswrapper[4894]: I0613 05:16:10.287626 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8402629-c5ed-4482-9a1c-bdf5caaa2a21" path="/var/lib/kubelet/pods/e8402629-c5ed-4482-9a1c-bdf5caaa2a21/volumes" Jun 13 05:16:10 crc kubenswrapper[4894]: I0613 05:16:10.771573 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" event={"ID":"7b0327b0-6896-425c-9e62-d179c465ff04","Type":"ContainerStarted","Data":"aa9ac4e1616dc7478149e608f0962016af48800050803d0a9558949bd49d148b"} Jun 13 05:16:10 crc kubenswrapper[4894]: I0613 05:16:10.799600 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" podStartSLOduration=2.34120965 podStartE2EDuration="2.799575398s" podCreationTimestamp="2025-06-13 05:16:08 +0000 UTC" firstStartedPulling="2025-06-13 05:16:09.586989284 +0000 UTC m=+1528.033236747" lastFinishedPulling="2025-06-13 05:16:10.045354992 +0000 UTC m=+1528.491602495" observedRunningTime="2025-06-13 05:16:10.793520236 +0000 UTC m=+1529.239767699" watchObservedRunningTime="2025-06-13 05:16:10.799575398 +0000 UTC m=+1529.245822891" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.511969 4894 scope.go:117] "RemoveContainer" containerID="5f13fe56c8b83a6fb60747b464bc9f7a699a37277a519bb9eaaa16a1c5540bc5" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.552193 4894 scope.go:117] "RemoveContainer" containerID="4de4772a609b7651df82f3f443c5f6187453e7fc6f7e9559ee485579979ba94b" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.616779 4894 scope.go:117] "RemoveContainer" containerID="e9a6e357fe34bfe6ad66fd02bbb1d82712e7ecc6779201673c409dac2488e220" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.664680 4894 scope.go:117] "RemoveContainer" containerID="e1eff5b7391ce617f84516910d42e0cb988c0199d1d6233bdd70f7c622e98682" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.708515 4894 scope.go:117] "RemoveContainer" containerID="7fcb47e81f9e7a6cdc8a98608a8682b05f8a9ef490eb7b9d185032288f09f93c" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.752038 4894 scope.go:117] "RemoveContainer" containerID="4d574624fe567cc280930834dc945e3168634c2b396e195f1264c69d10da2cbd" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.769855 4894 scope.go:117] "RemoveContainer" containerID="2074362eadcf5583824db0302d6442b4eb4183c1ecdce908429422e70357582f" Jun 13 05:16:11 crc kubenswrapper[4894]: I0613 05:16:11.794527 4894 scope.go:117] "RemoveContainer" containerID="7f669537e496e33923176d1259843065243d7a8916ad38c5c3362ab6dc51afbc" Jun 13 05:16:12 crc kubenswrapper[4894]: I0613 05:16:12.947368 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-h8jx4"] Jun 13 05:16:12 crc kubenswrapper[4894]: I0613 05:16:12.947940 4894 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/crc-debug-h8jx4" podUID="7404170e-9775-40ee-8144-f3214cb9257a" containerName="container-00" containerID="cri-o://ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131" gracePeriod=2 Jun 13 05:16:12 crc kubenswrapper[4894]: I0613 05:16:12.960489 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-h8jx4"] Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.039140 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-h8jx4" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.187716 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host\") pod \"7404170e-9775-40ee-8144-f3214cb9257a\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.187807 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5lrg\" (UniqueName: \"kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg\") pod \"7404170e-9775-40ee-8144-f3214cb9257a\" (UID: \"7404170e-9775-40ee-8144-f3214cb9257a\") " Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.188243 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host" (OuterVolumeSpecName: "host") pod "7404170e-9775-40ee-8144-f3214cb9257a" (UID: "7404170e-9775-40ee-8144-f3214cb9257a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.192970 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg" (OuterVolumeSpecName: "kube-api-access-z5lrg") pod "7404170e-9775-40ee-8144-f3214cb9257a" (UID: "7404170e-9775-40ee-8144-f3214cb9257a"). InnerVolumeSpecName "kube-api-access-z5lrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.289263 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7404170e-9775-40ee-8144-f3214cb9257a-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.289292 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5lrg\" (UniqueName: \"kubernetes.io/projected/7404170e-9775-40ee-8144-f3214cb9257a-kube-api-access-z5lrg\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.807960 4894 generic.go:334] "Generic (PLEG): container finished" podID="7404170e-9775-40ee-8144-f3214cb9257a" containerID="ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131" exitCode=0 Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.808006 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-h8jx4" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.808038 4894 scope.go:117] "RemoveContainer" containerID="ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.849353 4894 scope.go:117] "RemoveContainer" containerID="ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131" Jun 13 05:16:13 crc kubenswrapper[4894]: E0613 05:16:13.852028 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131\": container with ID starting with ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131 not found: ID does not exist" containerID="ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131" Jun 13 05:16:13 crc kubenswrapper[4894]: I0613 05:16:13.852092 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131"} err="failed to get container status \"ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131\": rpc error: code = NotFound desc = could not find container \"ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131\": container with ID starting with ed8768b395b960959caf1c39f495ed954eeb9282285b6be5f21993419954a131 not found: ID does not exist" Jun 13 05:16:14 crc kubenswrapper[4894]: I0613 05:16:14.287700 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7404170e-9775-40ee-8144-f3214cb9257a" path="/var/lib/kubelet/pods/7404170e-9775-40ee-8144-f3214cb9257a/volumes" Jun 13 05:16:15 crc kubenswrapper[4894]: I0613 05:16:15.834484 4894 generic.go:334] "Generic (PLEG): container finished" podID="7b0327b0-6896-425c-9e62-d179c465ff04" containerID="aa9ac4e1616dc7478149e608f0962016af48800050803d0a9558949bd49d148b" exitCode=0 Jun 13 05:16:15 crc kubenswrapper[4894]: I0613 05:16:15.834585 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" event={"ID":"7b0327b0-6896-425c-9e62-d179c465ff04","Type":"ContainerDied","Data":"aa9ac4e1616dc7478149e608f0962016af48800050803d0a9558949bd49d148b"} Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.270476 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.276272 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:16:17 crc kubenswrapper[4894]: E0613 05:16:17.276576 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.376761 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory\") pod \"7b0327b0-6896-425c-9e62-d179c465ff04\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.376970 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key\") pod \"7b0327b0-6896-425c-9e62-d179c465ff04\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.377035 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtcsk\" (UniqueName: \"kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk\") pod \"7b0327b0-6896-425c-9e62-d179c465ff04\" (UID: \"7b0327b0-6896-425c-9e62-d179c465ff04\") " Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.382418 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk" (OuterVolumeSpecName: "kube-api-access-mtcsk") pod "7b0327b0-6896-425c-9e62-d179c465ff04" (UID: "7b0327b0-6896-425c-9e62-d179c465ff04"). InnerVolumeSpecName "kube-api-access-mtcsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.405796 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory" (OuterVolumeSpecName: "inventory") pod "7b0327b0-6896-425c-9e62-d179c465ff04" (UID: "7b0327b0-6896-425c-9e62-d179c465ff04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.406991 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b0327b0-6896-425c-9e62-d179c465ff04" (UID: "7b0327b0-6896-425c-9e62-d179c465ff04"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.479701 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtcsk\" (UniqueName: \"kubernetes.io/projected/7b0327b0-6896-425c-9e62-d179c465ff04-kube-api-access-mtcsk\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.479729 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.479740 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b0327b0-6896-425c-9e62-d179c465ff04-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.861634 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" event={"ID":"7b0327b0-6896-425c-9e62-d179c465ff04","Type":"ContainerDied","Data":"12ea1d59e713518b93242648bc603bf5633065a0b76a8e5c2dae7c700187fe05"} Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.861718 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ea1d59e713518b93242648bc603bf5633065a0b76a8e5c2dae7c700187fe05" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.861726 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.961829 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424"] Jun 13 05:16:17 crc kubenswrapper[4894]: E0613 05:16:17.962211 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7404170e-9775-40ee-8144-f3214cb9257a" containerName="container-00" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.962227 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7404170e-9775-40ee-8144-f3214cb9257a" containerName="container-00" Jun 13 05:16:17 crc kubenswrapper[4894]: E0613 05:16:17.962260 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0327b0-6896-425c-9e62-d179c465ff04" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.962268 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0327b0-6896-425c-9e62-d179c465ff04" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.962438 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7404170e-9775-40ee-8144-f3214cb9257a" containerName="container-00" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.962454 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0327b0-6896-425c-9e62-d179c465ff04" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.963124 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.969916 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.975852 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.979349 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.983877 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:16:17 crc kubenswrapper[4894]: I0613 05:16:17.985746 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424"] Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.095989 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf7d4\" (UniqueName: \"kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.096050 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.096132 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.197847 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.197982 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf7d4\" (UniqueName: \"kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.198007 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" 
(UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.202821 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.216649 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.219313 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf7d4\" (UniqueName: \"kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rm424\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.285391 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:16:18 crc kubenswrapper[4894]: I0613 05:16:18.880472 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424"] Jun 13 05:16:19 crc kubenswrapper[4894]: I0613 05:16:19.896018 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" event={"ID":"2464fbe8-feb0-4aa6-8985-da3207358c52","Type":"ContainerStarted","Data":"84e911935c044276d3428d53206576eea5aad9005bcd4369306125b4316ec46c"} Jun 13 05:16:19 crc kubenswrapper[4894]: I0613 05:16:19.897551 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" event={"ID":"2464fbe8-feb0-4aa6-8985-da3207358c52","Type":"ContainerStarted","Data":"bc464de6df0cfec7a14551ab81684fadfc2b07c3705b14d66a11596bc356a5de"} Jun 13 05:16:19 crc kubenswrapper[4894]: I0613 05:16:19.914638 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" podStartSLOduration=2.42642192 podStartE2EDuration="2.914615863s" podCreationTimestamp="2025-06-13 05:16:17 +0000 UTC" firstStartedPulling="2025-06-13 05:16:18.890223068 +0000 UTC m=+1537.336470531" lastFinishedPulling="2025-06-13 05:16:19.378416971 +0000 UTC m=+1537.824664474" observedRunningTime="2025-06-13 05:16:19.913559093 +0000 UTC m=+1538.359806586" watchObservedRunningTime="2025-06-13 05:16:19.914615863 +0000 UTC m=+1538.360863366" Jun 13 05:16:29 crc kubenswrapper[4894]: I0613 05:16:29.276456 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:16:29 crc kubenswrapper[4894]: E0613 05:16:29.278196 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:16:38 crc kubenswrapper[4894]: I0613 05:16:38.047899 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hsnzt"] Jun 13 05:16:38 crc kubenswrapper[4894]: I0613 05:16:38.058392 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hsnzt"] Jun 13 05:16:38 crc kubenswrapper[4894]: I0613 05:16:38.286709 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d33205b-2c37-4b71-900b-a8e83762b63f" path="/var/lib/kubelet/pods/9d33205b-2c37-4b71-900b-a8e83762b63f/volumes" Jun 13 05:16:39 crc kubenswrapper[4894]: I0613 05:16:39.040318 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-sgg28"] Jun 13 05:16:39 crc kubenswrapper[4894]: I0613 05:16:39.052157 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t9lbd"] Jun 13 05:16:39 crc kubenswrapper[4894]: I0613 05:16:39.060838 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-sgg28"] Jun 13 05:16:39 crc kubenswrapper[4894]: I0613 05:16:39.068560 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t9lbd"] Jun 13 05:16:40 crc kubenswrapper[4894]: I0613 05:16:40.277421 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:16:40 crc kubenswrapper[4894]: E0613 05:16:40.277980 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:16:40 crc kubenswrapper[4894]: I0613 05:16:40.296626 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb429d3-d365-4dd2-9a8c-680247079215" path="/var/lib/kubelet/pods/3eb429d3-d365-4dd2-9a8c-680247079215/volumes" Jun 13 05:16:40 crc kubenswrapper[4894]: I0613 05:16:40.297494 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e952982d-eac0-4ff8-8817-551eed327bed" path="/var/lib/kubelet/pods/e952982d-eac0-4ff8-8817-551eed327bed/volumes" Jun 13 05:16:53 crc kubenswrapper[4894]: I0613 05:16:53.277958 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:16:53 crc kubenswrapper[4894]: E0613 05:16:53.278915 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.045451 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7c5d-account-create-rnzd5"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 
05:16:56.054881 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c7ad-account-create-5vctm"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.064800 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-edbe-account-create-jzgzz"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.072751 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c7ad-account-create-5vctm"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.080473 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7c5d-account-create-rnzd5"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.086151 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-edbe-account-create-jzgzz"] Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.293215 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28eb4b39-d8e5-4dde-ae16-931e82a524d6" path="/var/lib/kubelet/pods/28eb4b39-d8e5-4dde-ae16-931e82a524d6/volumes" Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.294732 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bbe296-097f-4020-bd12-476dfc968482" path="/var/lib/kubelet/pods/81bbe296-097f-4020-bd12-476dfc968482/volumes" Jun 13 05:16:56 crc kubenswrapper[4894]: I0613 05:16:56.295832 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa" path="/var/lib/kubelet/pods/fc63adf6-ddfa-4aa7-b47b-06a7c1f184fa/volumes" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.405854 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-xvnw4"] Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.408305 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.412401 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.454297 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72jh\" (UniqueName: \"kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.454386 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.555794 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g72jh\" (UniqueName: \"kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.555890 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.556077 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.579958 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72jh\" (UniqueName: \"kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh\") pod \"crc-debug-xvnw4\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " pod="openstack/crc-debug-xvnw4" Jun 13 05:17:02 crc kubenswrapper[4894]: I0613 05:17:02.730015 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xvnw4" Jun 13 05:17:03 crc kubenswrapper[4894]: I0613 05:17:03.335077 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xvnw4" event={"ID":"5a79612c-d83a-4536-85c2-d2a7a97ae353","Type":"ContainerStarted","Data":"24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831"} Jun 13 05:17:03 crc kubenswrapper[4894]: I0613 05:17:03.335345 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xvnw4" event={"ID":"5a79612c-d83a-4536-85c2-d2a7a97ae353","Type":"ContainerStarted","Data":"2ea00aa5070103231792518881aa312a5845ad55507682d3083e42d51fff8ef5"} Jun 13 05:17:03 crc kubenswrapper[4894]: I0613 05:17:03.364004 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-xvnw4" podStartSLOduration=1.363988122 podStartE2EDuration="1.363988122s" podCreationTimestamp="2025-06-13 05:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:17:03.357499058 +0000 UTC m=+1581.803746521" watchObservedRunningTime="2025-06-13 05:17:03.363988122 +0000 UTC m=+1581.810235595" Jun 13 05:17:07 crc kubenswrapper[4894]: I0613 05:17:07.278189 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:17:07 crc kubenswrapper[4894]: E0613 05:17:07.278955 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:17:11 crc kubenswrapper[4894]: I0613 05:17:11.958311 4894 scope.go:117] "RemoveContainer" containerID="c6d46d7984c702bd799dff7331a0b0c8020ddae459044cce2084d76338fe1bad" Jun 13 05:17:11 crc kubenswrapper[4894]: I0613 05:17:11.987812 4894 scope.go:117] "RemoveContainer" containerID="ec261e4aba0a91e1b40ef5fcb71e22a63b7e01095b2ed40b51fa369bbbf4fec5" Jun 13 05:17:12 crc kubenswrapper[4894]: I0613 05:17:12.038314 4894 scope.go:117] "RemoveContainer" containerID="4beda1607aaafccd72fa80f4b61226123ba8d80f05e896cef806cbe77d0cc0e8" Jun 13 05:17:12 crc kubenswrapper[4894]: I0613 05:17:12.066162 4894 scope.go:117] "RemoveContainer" containerID="f37dadb62cd3fffdd7fbe1898bf07bfb37f8de28814fe1919c14cecf93e655c3" Jun 13 05:17:12 crc kubenswrapper[4894]: I0613 05:17:12.099087 4894 scope.go:117] "RemoveContainer" containerID="b7501545ceeb8b2797eb76f0fbad6c45682cf63db86e6e58d401dc5368c14243" Jun 13 05:17:12 crc kubenswrapper[4894]: I0613 05:17:12.131187 4894 scope.go:117] "RemoveContainer" containerID="aafb0254c609ddb94ce465f092dced91a23b4fd418ddc153f1988d06bc4ce1ee" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.307971 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-xvnw4"] Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.308251 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-xvnw4" podUID="5a79612c-d83a-4536-85c2-d2a7a97ae353" containerName="container-00" containerID="cri-o://24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831" gracePeriod=2 Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.317838 4894 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-xvnw4"] Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.410858 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xvnw4" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.421860 4894 generic.go:334] "Generic (PLEG): container finished" podID="5a79612c-d83a-4536-85c2-d2a7a97ae353" containerID="24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831" exitCode=0 Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.421908 4894 scope.go:117] "RemoveContainer" containerID="24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.421929 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xvnw4" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.439407 4894 scope.go:117] "RemoveContainer" containerID="24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831" Jun 13 05:17:13 crc kubenswrapper[4894]: E0613 05:17:13.439888 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831\": container with ID starting with 24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831 not found: ID does not exist" containerID="24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.439919 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831"} err="failed to get container status \"24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831\": rpc error: code = NotFound desc = could not find container \"24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831\": container with ID starting with 24c7501b2c2dacbbef904f1913f1bd9fe92acf3eaafdcf405968aa345f843831 not found: ID does not exist" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.476584 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g72jh\" (UniqueName: \"kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh\") pod \"5a79612c-d83a-4536-85c2-d2a7a97ae353\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.476709 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host\") pod \"5a79612c-d83a-4536-85c2-d2a7a97ae353\" (UID: \"5a79612c-d83a-4536-85c2-d2a7a97ae353\") " Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.476774 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host" (OuterVolumeSpecName: "host") pod "5a79612c-d83a-4536-85c2-d2a7a97ae353" (UID: "5a79612c-d83a-4536-85c2-d2a7a97ae353"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.477077 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a79612c-d83a-4536-85c2-d2a7a97ae353-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.482231 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh" (OuterVolumeSpecName: "kube-api-access-g72jh") pod "5a79612c-d83a-4536-85c2-d2a7a97ae353" (UID: "5a79612c-d83a-4536-85c2-d2a7a97ae353"). InnerVolumeSpecName "kube-api-access-g72jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:17:13 crc kubenswrapper[4894]: I0613 05:17:13.578669 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g72jh\" (UniqueName: \"kubernetes.io/projected/5a79612c-d83a-4536-85c2-d2a7a97ae353-kube-api-access-g72jh\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:14 crc kubenswrapper[4894]: I0613 05:17:14.285497 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a79612c-d83a-4536-85c2-d2a7a97ae353" path="/var/lib/kubelet/pods/5a79612c-d83a-4536-85c2-d2a7a97ae353/volumes" Jun 13 05:17:19 crc kubenswrapper[4894]: I0613 05:17:19.277163 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:17:19 crc kubenswrapper[4894]: E0613 05:17:19.277938 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:17:21 crc kubenswrapper[4894]: I0613 05:17:21.049788 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gb6nq"] Jun 13 05:17:21 crc kubenswrapper[4894]: I0613 05:17:21.067138 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gb6nq"] Jun 13 05:17:22 crc kubenswrapper[4894]: I0613 05:17:22.290368 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d3ed6c-aff6-4d8c-a6c6-05341fb1a048" path="/var/lib/kubelet/pods/49d3ed6c-aff6-4d8c-a6c6-05341fb1a048/volumes" Jun 13 05:17:26 crc kubenswrapper[4894]: I0613 05:17:26.539326 4894 generic.go:334] "Generic (PLEG): container finished" podID="2464fbe8-feb0-4aa6-8985-da3207358c52" containerID="84e911935c044276d3428d53206576eea5aad9005bcd4369306125b4316ec46c" exitCode=0 Jun 13 05:17:26 crc kubenswrapper[4894]: I0613 05:17:26.539398 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" event={"ID":"2464fbe8-feb0-4aa6-8985-da3207358c52","Type":"ContainerDied","Data":"84e911935c044276d3428d53206576eea5aad9005bcd4369306125b4316ec46c"} Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.058420 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.171970 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key\") pod \"2464fbe8-feb0-4aa6-8985-da3207358c52\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.172079 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf7d4\" (UniqueName: \"kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4\") pod \"2464fbe8-feb0-4aa6-8985-da3207358c52\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.172262 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory\") pod \"2464fbe8-feb0-4aa6-8985-da3207358c52\" (UID: \"2464fbe8-feb0-4aa6-8985-da3207358c52\") " Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.177275 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4" (OuterVolumeSpecName: "kube-api-access-tf7d4") pod "2464fbe8-feb0-4aa6-8985-da3207358c52" (UID: "2464fbe8-feb0-4aa6-8985-da3207358c52"). InnerVolumeSpecName "kube-api-access-tf7d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.204560 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2464fbe8-feb0-4aa6-8985-da3207358c52" (UID: "2464fbe8-feb0-4aa6-8985-da3207358c52"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.206078 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory" (OuterVolumeSpecName: "inventory") pod "2464fbe8-feb0-4aa6-8985-da3207358c52" (UID: "2464fbe8-feb0-4aa6-8985-da3207358c52"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.274737 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf7d4\" (UniqueName: \"kubernetes.io/projected/2464fbe8-feb0-4aa6-8985-da3207358c52-kube-api-access-tf7d4\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.274764 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.274774 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2464fbe8-feb0-4aa6-8985-da3207358c52-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.564401 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" event={"ID":"2464fbe8-feb0-4aa6-8985-da3207358c52","Type":"ContainerDied","Data":"bc464de6df0cfec7a14551ab81684fadfc2b07c3705b14d66a11596bc356a5de"} Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.564459 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc464de6df0cfec7a14551ab81684fadfc2b07c3705b14d66a11596bc356a5de" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.564514 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.659525 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8nbmx"] Jun 13 05:17:28 crc kubenswrapper[4894]: E0613 05:17:28.659834 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2464fbe8-feb0-4aa6-8985-da3207358c52" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.659850 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2464fbe8-feb0-4aa6-8985-da3207358c52" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:17:28 crc kubenswrapper[4894]: E0613 05:17:28.659873 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a79612c-d83a-4536-85c2-d2a7a97ae353" containerName="container-00" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.659880 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a79612c-d83a-4536-85c2-d2a7a97ae353" containerName="container-00" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.660053 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2464fbe8-feb0-4aa6-8985-da3207358c52" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.660072 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a79612c-d83a-4536-85c2-d2a7a97ae353" containerName="container-00" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.661496 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.664709 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.666202 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.666574 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.666591 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.682858 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8nbmx"] Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.788820 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrwx\" (UniqueName: \"kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.788905 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.788945 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.890578 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.890674 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.890848 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrwx\" (UniqueName: \"kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc 
kubenswrapper[4894]: I0613 05:17:28.895962 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.909363 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.944264 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrwx\" (UniqueName: \"kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx\") pod \"ssh-known-hosts-edpm-deployment-8nbmx\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:28 crc kubenswrapper[4894]: I0613 05:17:28.975557 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:29 crc kubenswrapper[4894]: I0613 05:17:29.638773 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8nbmx"] Jun 13 05:17:30 crc kubenswrapper[4894]: I0613 05:17:30.586933 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" event={"ID":"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf","Type":"ContainerStarted","Data":"b0a5a0a6443e76bf98e6f1686942f702b8bf2ef709847d4c328d6acb8ed9b96d"} Jun 13 05:17:30 crc kubenswrapper[4894]: I0613 05:17:30.587242 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" event={"ID":"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf","Type":"ContainerStarted","Data":"285ea16f4bcabb9709f960b2cb1b5cfafa00cc8e95d415a10e52d62a41f98758"} Jun 13 05:17:30 crc kubenswrapper[4894]: I0613 05:17:30.616368 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" podStartSLOduration=2.1199901 podStartE2EDuration="2.616346843s" podCreationTimestamp="2025-06-13 05:17:28 +0000 UTC" firstStartedPulling="2025-06-13 05:17:29.64992768 +0000 UTC m=+1608.096175153" lastFinishedPulling="2025-06-13 05:17:30.146284403 +0000 UTC m=+1608.592531896" observedRunningTime="2025-06-13 05:17:30.613892113 +0000 UTC m=+1609.060139616" watchObservedRunningTime="2025-06-13 05:17:30.616346843 +0000 UTC m=+1609.062594316" Jun 13 05:17:33 crc kubenswrapper[4894]: I0613 05:17:33.277135 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:17:33 crc kubenswrapper[4894]: E0613 05:17:33.277619 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:17:38 crc 
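Annotation: the pod_startup_latency_tracker entry above for openstack/ssh-known-hosts-edpm-deployment-8nbmx reports both podStartE2EDuration and a smaller podStartSLOduration; the gap between them matches, to the precision the log prints, the image-pull window bounded by firstStartedPulling and lastFinishedPulling. Below is a minimal Python sketch (not kubelet code) re-deriving the SLO figure from the values in that entry; the "SLO duration excludes image-pull time" reading is an inference from these numbers, not something the log states.

```python
# Minimal sketch (not kubelet code): re-derive podStartSLOduration for
# openstack/ssh-known-hosts-edpm-deployment-8nbmx from the entry above.
# Timestamps are reduced to seconds within 05:17, copied verbatim from the log.
first_started_pulling = 29.64992768    # firstStartedPulling = 05:17:29.64992768
last_finished_pulling = 30.146284403   # lastFinishedPulling = 05:17:30.146284403
pod_start_e2e         = 2.616346843    # podStartE2EDuration, in seconds

pull_window = last_finished_pulling - first_started_pulling
derived_slo = pod_start_e2e - pull_window
print(f"image-pull window : {pull_window:.9f}s")
print(f"derived SLO value : {derived_slo:.7f}s")  # ~2.1199901, matching podStartSLOduration
```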
kubenswrapper[4894]: I0613 05:17:38.676341 4894 generic.go:334] "Generic (PLEG): container finished" podID="3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" containerID="b0a5a0a6443e76bf98e6f1686942f702b8bf2ef709847d4c328d6acb8ed9b96d" exitCode=0 Jun 13 05:17:38 crc kubenswrapper[4894]: I0613 05:17:38.676443 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" event={"ID":"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf","Type":"ContainerDied","Data":"b0a5a0a6443e76bf98e6f1686942f702b8bf2ef709847d4c328d6acb8ed9b96d"} Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.139392 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.307741 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam\") pod \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.307836 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfrwx\" (UniqueName: \"kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx\") pod \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.308068 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0\") pod \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\" (UID: \"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf\") " Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.327007 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx" (OuterVolumeSpecName: "kube-api-access-cfrwx") pod "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" (UID: "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf"). InnerVolumeSpecName "kube-api-access-cfrwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.351325 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" (UID: "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.355650 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" (UID: "3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.410051 4894 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-inventory-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.410078 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.410090 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfrwx\" (UniqueName: \"kubernetes.io/projected/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf-kube-api-access-cfrwx\") on node \"crc\" DevicePath \"\"" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.700627 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" event={"ID":"3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf","Type":"ContainerDied","Data":"285ea16f4bcabb9709f960b2cb1b5cfafa00cc8e95d415a10e52d62a41f98758"} Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.700709 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285ea16f4bcabb9709f960b2cb1b5cfafa00cc8e95d415a10e52d62a41f98758" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.700760 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8nbmx" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.819557 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp"] Jun 13 05:17:40 crc kubenswrapper[4894]: E0613 05:17:40.820039 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.820066 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.820405 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.821337 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.824598 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.824914 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.825202 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.825335 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.839724 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp"] Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.922080 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.922228 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj2z6\" (UniqueName: \"kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:40 crc kubenswrapper[4894]: I0613 05:17:40.922425 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.023492 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.023561 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.023611 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj2z6\" (UniqueName: \"kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.028044 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.028081 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.040528 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj2z6\" (UniqueName: \"kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4b5gp\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.137451 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.672032 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp"] Jun 13 05:17:41 crc kubenswrapper[4894]: I0613 05:17:41.709950 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" event={"ID":"44d8727f-e986-41ac-abb1-f33e1b395d03","Type":"ContainerStarted","Data":"c2b6834933e7fa725d4cf1987f2addcfcc5bbbccf537a182e9356c3b37144240"} Jun 13 05:17:42 crc kubenswrapper[4894]: I0613 05:17:42.372567 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:17:42 crc kubenswrapper[4894]: I0613 05:17:42.722467 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" event={"ID":"44d8727f-e986-41ac-abb1-f33e1b395d03","Type":"ContainerStarted","Data":"2fc71543fea0c87f184bf2b24d3f67b589d1a15b69ccafaa720416738d5821c1"} Jun 13 05:17:42 crc kubenswrapper[4894]: I0613 05:17:42.764622 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" podStartSLOduration=2.073424027 podStartE2EDuration="2.764584123s" podCreationTimestamp="2025-06-13 05:17:40 +0000 UTC" firstStartedPulling="2025-06-13 05:17:41.679264768 +0000 UTC m=+1620.125512241" lastFinishedPulling="2025-06-13 05:17:42.370424864 +0000 UTC m=+1620.816672337" observedRunningTime="2025-06-13 05:17:42.75106726 +0000 UTC m=+1621.197314723" watchObservedRunningTime="2025-06-13 05:17:42.764584123 +0000 UTC m=+1621.210831596" Jun 13 05:17:43 crc kubenswrapper[4894]: I0613 05:17:43.058076 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pcx8"] Jun 13 05:17:43 crc kubenswrapper[4894]: I0613 05:17:43.075877 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pcx8"] Jun 13 05:17:44 crc 
kubenswrapper[4894]: I0613 05:17:44.037749 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5lt6d"] Jun 13 05:17:44 crc kubenswrapper[4894]: I0613 05:17:44.046154 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5lt6d"] Jun 13 05:17:44 crc kubenswrapper[4894]: I0613 05:17:44.279571 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:17:44 crc kubenswrapper[4894]: E0613 05:17:44.280442 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:17:44 crc kubenswrapper[4894]: I0613 05:17:44.296699 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e" path="/var/lib/kubelet/pods/ea1a30de-5a55-4a1d-b7a9-42fc2e25e49e/volumes" Jun 13 05:17:44 crc kubenswrapper[4894]: I0613 05:17:44.298170 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a969ac-8f70-4cd4-bf65-75a10c7f14e9" path="/var/lib/kubelet/pods/f7a969ac-8f70-4cd4-bf65-75a10c7f14e9/volumes" Jun 13 05:17:57 crc kubenswrapper[4894]: I0613 05:17:57.276588 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:17:57 crc kubenswrapper[4894]: E0613 05:17:57.277637 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.733233 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-8zw5m"] Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.755329 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.764140 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.849291 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj5lv\" (UniqueName: \"kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.849368 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.952052 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj5lv\" (UniqueName: \"kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.952315 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.952440 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:01 crc kubenswrapper[4894]: I0613 05:18:01.987604 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj5lv\" (UniqueName: \"kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv\") pod \"crc-debug-8zw5m\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " pod="openstack/crc-debug-8zw5m" Jun 13 05:18:02 crc kubenswrapper[4894]: I0613 05:18:02.081255 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8zw5m" Jun 13 05:18:02 crc kubenswrapper[4894]: I0613 05:18:02.926376 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-8zw5m" event={"ID":"da3e072e-d199-4a17-aa3a-b9c11c049e31","Type":"ContainerStarted","Data":"90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c"} Jun 13 05:18:02 crc kubenswrapper[4894]: I0613 05:18:02.926735 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-8zw5m" event={"ID":"da3e072e-d199-4a17-aa3a-b9c11c049e31","Type":"ContainerStarted","Data":"62a9e7b2f0748d21601660f01c06a8663379f05f11cadc97a9220461d69a07c8"} Jun 13 05:18:02 crc kubenswrapper[4894]: I0613 05:18:02.943556 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-8zw5m" podStartSLOduration=1.943538514 podStartE2EDuration="1.943538514s" podCreationTimestamp="2025-06-13 05:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:18:02.942464354 +0000 UTC m=+1641.388711827" watchObservedRunningTime="2025-06-13 05:18:02.943538514 +0000 UTC m=+1641.389785987" Jun 13 05:18:03 crc kubenswrapper[4894]: I0613 05:18:03.043213 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vnlc"] Jun 13 05:18:03 crc kubenswrapper[4894]: I0613 05:18:03.049316 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vnlc"] Jun 13 05:18:04 crc kubenswrapper[4894]: I0613 05:18:04.296998 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c9d623-ef25-45c6-a9c8-f34f64c922cd" path="/var/lib/kubelet/pods/f9c9d623-ef25-45c6-a9c8-f34f64c922cd/volumes" Jun 13 05:18:09 crc kubenswrapper[4894]: I0613 05:18:09.276784 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:18:09 crc kubenswrapper[4894]: E0613 05:18:09.277881 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.268061 4894 scope.go:117] "RemoveContainer" containerID="60473c4bdbdb5c31f704e449ad05fc0f353847c193e5234c2a89c530ccc096ed" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.312180 4894 scope.go:117] "RemoveContainer" containerID="72acc3ef48c68ca48e5567f7ae89fb63a4cb5632da4e751ac791c51d234c886f" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.348997 4894 scope.go:117] "RemoveContainer" containerID="784e390bdb1c8752660f1f76613cd6f9e1b504aa70399033def4b41a7426f163" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.372576 4894 scope.go:117] "RemoveContainer" containerID="dbf9461b8c1db0f32df8cdd1d713125fe988052066556d56b5b06e1e2c1d948a" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.661351 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-8zw5m"] Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.661836 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-8zw5m" podUID="da3e072e-d199-4a17-aa3a-b9c11c049e31" 
containerName="container-00" containerID="cri-o://90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c" gracePeriod=2 Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.670958 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-8zw5m"] Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.770975 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-8zw5m" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.948118 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host\") pod \"da3e072e-d199-4a17-aa3a-b9c11c049e31\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.948217 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host" (OuterVolumeSpecName: "host") pod "da3e072e-d199-4a17-aa3a-b9c11c049e31" (UID: "da3e072e-d199-4a17-aa3a-b9c11c049e31"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.948243 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj5lv\" (UniqueName: \"kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv\") pod \"da3e072e-d199-4a17-aa3a-b9c11c049e31\" (UID: \"da3e072e-d199-4a17-aa3a-b9c11c049e31\") " Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.948698 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3e072e-d199-4a17-aa3a-b9c11c049e31-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:12 crc kubenswrapper[4894]: I0613 05:18:12.954669 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv" (OuterVolumeSpecName: "kube-api-access-tj5lv") pod "da3e072e-d199-4a17-aa3a-b9c11c049e31" (UID: "da3e072e-d199-4a17-aa3a-b9c11c049e31"). InnerVolumeSpecName "kube-api-access-tj5lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.047126 4894 generic.go:334] "Generic (PLEG): container finished" podID="da3e072e-d199-4a17-aa3a-b9c11c049e31" containerID="90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c" exitCode=0 Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.047192 4894 scope.go:117] "RemoveContainer" containerID="90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c" Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.047314 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8zw5m" Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.050094 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj5lv\" (UniqueName: \"kubernetes.io/projected/da3e072e-d199-4a17-aa3a-b9c11c049e31-kube-api-access-tj5lv\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.080678 4894 scope.go:117] "RemoveContainer" containerID="90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c" Jun 13 05:18:13 crc kubenswrapper[4894]: E0613 05:18:13.081146 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c\": container with ID starting with 90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c not found: ID does not exist" containerID="90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c" Jun 13 05:18:13 crc kubenswrapper[4894]: I0613 05:18:13.081197 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c"} err="failed to get container status \"90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c\": rpc error: code = NotFound desc = could not find container \"90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c\": container with ID starting with 90c576710e5add83eaa1903d49f594d89b9ce27074f91bd98a3043d74d809f6c not found: ID does not exist" Jun 13 05:18:14 crc kubenswrapper[4894]: I0613 05:18:14.294496 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3e072e-d199-4a17-aa3a-b9c11c049e31" path="/var/lib/kubelet/pods/da3e072e-d199-4a17-aa3a-b9c11c049e31/volumes" Jun 13 05:18:15 crc kubenswrapper[4894]: I0613 05:18:15.071434 4894 generic.go:334] "Generic (PLEG): container finished" podID="44d8727f-e986-41ac-abb1-f33e1b395d03" containerID="2fc71543fea0c87f184bf2b24d3f67b589d1a15b69ccafaa720416738d5821c1" exitCode=0 Jun 13 05:18:15 crc kubenswrapper[4894]: I0613 05:18:15.071486 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" event={"ID":"44d8727f-e986-41ac-abb1-f33e1b395d03","Type":"ContainerDied","Data":"2fc71543fea0c87f184bf2b24d3f67b589d1a15b69ccafaa720416738d5821c1"} Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.489136 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.532264 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj2z6\" (UniqueName: \"kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6\") pod \"44d8727f-e986-41ac-abb1-f33e1b395d03\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.532366 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key\") pod \"44d8727f-e986-41ac-abb1-f33e1b395d03\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.533218 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory\") pod \"44d8727f-e986-41ac-abb1-f33e1b395d03\" (UID: \"44d8727f-e986-41ac-abb1-f33e1b395d03\") " Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.538481 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6" (OuterVolumeSpecName: "kube-api-access-hj2z6") pod "44d8727f-e986-41ac-abb1-f33e1b395d03" (UID: "44d8727f-e986-41ac-abb1-f33e1b395d03"). InnerVolumeSpecName "kube-api-access-hj2z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.566947 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44d8727f-e986-41ac-abb1-f33e1b395d03" (UID: "44d8727f-e986-41ac-abb1-f33e1b395d03"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.568801 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory" (OuterVolumeSpecName: "inventory") pod "44d8727f-e986-41ac-abb1-f33e1b395d03" (UID: "44d8727f-e986-41ac-abb1-f33e1b395d03"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.635991 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj2z6\" (UniqueName: \"kubernetes.io/projected/44d8727f-e986-41ac-abb1-f33e1b395d03-kube-api-access-hj2z6\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.636018 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:16 crc kubenswrapper[4894]: I0613 05:18:16.636029 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44d8727f-e986-41ac-abb1-f33e1b395d03-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.095334 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" event={"ID":"44d8727f-e986-41ac-abb1-f33e1b395d03","Type":"ContainerDied","Data":"c2b6834933e7fa725d4cf1987f2addcfcc5bbbccf537a182e9356c3b37144240"} Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.095368 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b6834933e7fa725d4cf1987f2addcfcc5bbbccf537a182e9356c3b37144240" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.095648 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.199223 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8"] Jun 13 05:18:17 crc kubenswrapper[4894]: E0613 05:18:17.199995 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3e072e-d199-4a17-aa3a-b9c11c049e31" containerName="container-00" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.200018 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3e072e-d199-4a17-aa3a-b9c11c049e31" containerName="container-00" Jun 13 05:18:17 crc kubenswrapper[4894]: E0613 05:18:17.200042 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d8727f-e986-41ac-abb1-f33e1b395d03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.200052 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d8727f-e986-41ac-abb1-f33e1b395d03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.200261 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3e072e-d199-4a17-aa3a-b9c11c049e31" containerName="container-00" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.200295 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d8727f-e986-41ac-abb1-f33e1b395d03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.204588 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.208449 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.208838 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.208883 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.209009 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.215329 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8"] Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.248410 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.248498 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvtr\" (UniqueName: \"kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.248660 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.349956 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvtr\" (UniqueName: \"kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.350092 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.350142 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: 
\"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.355524 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.363024 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.377563 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvtr\" (UniqueName: \"kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:17 crc kubenswrapper[4894]: I0613 05:18:17.531731 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:18 crc kubenswrapper[4894]: I0613 05:18:18.085012 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8"] Jun 13 05:18:19 crc kubenswrapper[4894]: I0613 05:18:19.115485 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" event={"ID":"fd5f7535-a39c-4891-85fe-f1def5f6f9a8","Type":"ContainerStarted","Data":"6e59e6b0f7ddd9ee5593290a812d2a22bc7fee543564cdb3e9a71aa07eaec332"} Jun 13 05:18:20 crc kubenswrapper[4894]: I0613 05:18:20.124970 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" event={"ID":"fd5f7535-a39c-4891-85fe-f1def5f6f9a8","Type":"ContainerStarted","Data":"08cfc161a38c832f689277ac45e13aa38ae4bdfab62ff3eb6b5a010bfac2c2bd"} Jun 13 05:18:20 crc kubenswrapper[4894]: I0613 05:18:20.161387 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" podStartSLOduration=2.08379969 podStartE2EDuration="3.161360716s" podCreationTimestamp="2025-06-13 05:18:17 +0000 UTC" firstStartedPulling="2025-06-13 05:18:18.105031552 +0000 UTC m=+1656.551279025" lastFinishedPulling="2025-06-13 05:18:19.182592548 +0000 UTC m=+1657.628840051" observedRunningTime="2025-06-13 05:18:20.154311706 +0000 UTC m=+1658.600559209" watchObservedRunningTime="2025-06-13 05:18:20.161360716 +0000 UTC m=+1658.607608209" Jun 13 05:18:21 crc kubenswrapper[4894]: I0613 05:18:21.276388 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:18:21 crc kubenswrapper[4894]: E0613 05:18:21.276901 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:18:29 crc kubenswrapper[4894]: I0613 05:18:29.223518 4894 generic.go:334] "Generic (PLEG): container finished" podID="fd5f7535-a39c-4891-85fe-f1def5f6f9a8" containerID="08cfc161a38c832f689277ac45e13aa38ae4bdfab62ff3eb6b5a010bfac2c2bd" exitCode=0 Jun 13 05:18:29 crc kubenswrapper[4894]: I0613 05:18:29.223681 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" event={"ID":"fd5f7535-a39c-4891-85fe-f1def5f6f9a8","Type":"ContainerDied","Data":"08cfc161a38c832f689277ac45e13aa38ae4bdfab62ff3eb6b5a010bfac2c2bd"} Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.696963 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.721888 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory\") pod \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.721992 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key\") pod \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.722055 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjvtr\" (UniqueName: \"kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr\") pod \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\" (UID: \"fd5f7535-a39c-4891-85fe-f1def5f6f9a8\") " Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.726513 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr" (OuterVolumeSpecName: "kube-api-access-tjvtr") pod "fd5f7535-a39c-4891-85fe-f1def5f6f9a8" (UID: "fd5f7535-a39c-4891-85fe-f1def5f6f9a8"). InnerVolumeSpecName "kube-api-access-tjvtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.747028 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory" (OuterVolumeSpecName: "inventory") pod "fd5f7535-a39c-4891-85fe-f1def5f6f9a8" (UID: "fd5f7535-a39c-4891-85fe-f1def5f6f9a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.749270 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd5f7535-a39c-4891-85fe-f1def5f6f9a8" (UID: "fd5f7535-a39c-4891-85fe-f1def5f6f9a8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.824910 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjvtr\" (UniqueName: \"kubernetes.io/projected/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-kube-api-access-tjvtr\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.824956 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:30 crc kubenswrapper[4894]: I0613 05:18:30.824975 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd5f7535-a39c-4891-85fe-f1def5f6f9a8-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:18:31 crc kubenswrapper[4894]: I0613 05:18:31.258624 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" event={"ID":"fd5f7535-a39c-4891-85fe-f1def5f6f9a8","Type":"ContainerDied","Data":"6e59e6b0f7ddd9ee5593290a812d2a22bc7fee543564cdb3e9a71aa07eaec332"} Jun 13 05:18:31 crc kubenswrapper[4894]: I0613 05:18:31.259138 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e59e6b0f7ddd9ee5593290a812d2a22bc7fee543564cdb3e9a71aa07eaec332" Jun 13 05:18:31 crc kubenswrapper[4894]: I0613 05:18:31.258796 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8" Jun 13 05:18:32 crc kubenswrapper[4894]: I0613 05:18:32.283820 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:18:32 crc kubenswrapper[4894]: E0613 05:18:32.284120 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:18:47 crc kubenswrapper[4894]: I0613 05:18:47.278419 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:18:47 crc kubenswrapper[4894]: E0613 05:18:47.280105 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:19:01 crc kubenswrapper[4894]: I0613 05:19:01.277513 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:19:01 crc kubenswrapper[4894]: E0613 05:19:01.278935 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.101057 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-jvnss"] Jun 13 05:19:02 crc kubenswrapper[4894]: E0613 05:19:02.101716 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5f7535-a39c-4891-85fe-f1def5f6f9a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.101740 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5f7535-a39c-4891-85fe-f1def5f6f9a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.101952 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5f7535-a39c-4891-85fe-f1def5f6f9a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.102616 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.105458 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.175321 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.175429 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfph\" (UniqueName: \"kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.277524 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.277625 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.277775 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfph\" (UniqueName: \"kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.299906 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfph\" (UniqueName: \"kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph\") pod \"crc-debug-jvnss\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.418428 4894 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jvnss" Jun 13 05:19:02 crc kubenswrapper[4894]: I0613 05:19:02.605943 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jvnss" event={"ID":"94379eb5-911e-4617-8323-48d1ead5cbda","Type":"ContainerStarted","Data":"2d12bdd1f05fff3b75b2927e8519dfd6e0eed28a5271f2c65fc1d43b73d85627"} Jun 13 05:19:03 crc kubenswrapper[4894]: I0613 05:19:03.618598 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jvnss" event={"ID":"94379eb5-911e-4617-8323-48d1ead5cbda","Type":"ContainerStarted","Data":"10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94"} Jun 13 05:19:03 crc kubenswrapper[4894]: I0613 05:19:03.643805 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-jvnss" podStartSLOduration=1.643782503 podStartE2EDuration="1.643782503s" podCreationTimestamp="2025-06-13 05:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:19:03.63661103 +0000 UTC m=+1702.082858533" watchObservedRunningTime="2025-06-13 05:19:03.643782503 +0000 UTC m=+1702.090029996" Jun 13 05:19:12 crc kubenswrapper[4894]: I0613 05:19:12.916176 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-jvnss"] Jun 13 05:19:12 crc kubenswrapper[4894]: I0613 05:19:12.917297 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-jvnss" podUID="94379eb5-911e-4617-8323-48d1ead5cbda" containerName="container-00" containerID="cri-o://10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94" gracePeriod=2 Jun 13 05:19:12 crc kubenswrapper[4894]: I0613 05:19:12.920594 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-jvnss"] Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.029215 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jvnss" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.097298 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfph\" (UniqueName: \"kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph\") pod \"94379eb5-911e-4617-8323-48d1ead5cbda\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.097542 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host\") pod \"94379eb5-911e-4617-8323-48d1ead5cbda\" (UID: \"94379eb5-911e-4617-8323-48d1ead5cbda\") " Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.097743 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host" (OuterVolumeSpecName: "host") pod "94379eb5-911e-4617-8323-48d1ead5cbda" (UID: "94379eb5-911e-4617-8323-48d1ead5cbda"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.098040 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94379eb5-911e-4617-8323-48d1ead5cbda-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.107039 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph" (OuterVolumeSpecName: "kube-api-access-pkfph") pod "94379eb5-911e-4617-8323-48d1ead5cbda" (UID: "94379eb5-911e-4617-8323-48d1ead5cbda"). InnerVolumeSpecName "kube-api-access-pkfph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.200167 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfph\" (UniqueName: \"kubernetes.io/projected/94379eb5-911e-4617-8323-48d1ead5cbda-kube-api-access-pkfph\") on node \"crc\" DevicePath \"\"" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.720483 4894 generic.go:334] "Generic (PLEG): container finished" podID="94379eb5-911e-4617-8323-48d1ead5cbda" containerID="10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94" exitCode=0 Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.720545 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jvnss" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.720568 4894 scope.go:117] "RemoveContainer" containerID="10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.749780 4894 scope.go:117] "RemoveContainer" containerID="10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94" Jun 13 05:19:13 crc kubenswrapper[4894]: E0613 05:19:13.750341 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94\": container with ID starting with 10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94 not found: ID does not exist" containerID="10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94" Jun 13 05:19:13 crc kubenswrapper[4894]: I0613 05:19:13.750404 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94"} err="failed to get container status \"10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94\": rpc error: code = NotFound desc = could not find container \"10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94\": container with ID starting with 10b8a2e567c9e8baa0e82b9678b7fbafa021218fb988f993c21b8624ed012e94 not found: ID does not exist" Jun 13 05:19:14 crc kubenswrapper[4894]: I0613 05:19:14.294884 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94379eb5-911e-4617-8323-48d1ead5cbda" path="/var/lib/kubelet/pods/94379eb5-911e-4617-8323-48d1ead5cbda/volumes" Jun 13 05:19:16 crc kubenswrapper[4894]: I0613 05:19:16.276814 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:19:16 crc kubenswrapper[4894]: E0613 05:19:16.277745 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:19:29 crc kubenswrapper[4894]: I0613 05:19:29.277534 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:19:29 crc kubenswrapper[4894]: E0613 05:19:29.279527 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:19:44 crc kubenswrapper[4894]: I0613 05:19:44.277770 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:19:44 crc kubenswrapper[4894]: E0613 05:19:44.281050 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:19:56 crc kubenswrapper[4894]: I0613 05:19:56.277890 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:19:56 crc kubenswrapper[4894]: E0613 05:19:56.279037 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.319915 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-vds2t"] Jun 13 05:20:02 crc kubenswrapper[4894]: E0613 05:20:02.320821 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94379eb5-911e-4617-8323-48d1ead5cbda" containerName="container-00" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.320838 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="94379eb5-911e-4617-8323-48d1ead5cbda" containerName="container-00" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.321046 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="94379eb5-911e-4617-8323-48d1ead5cbda" containerName="container-00" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.321763 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.325630 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.417745 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.417840 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcf7\" (UniqueName: \"kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.520099 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.520176 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcf7\" (UniqueName: \"kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.520719 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.547010 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcf7\" (UniqueName: \"kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7\") pod \"crc-debug-vds2t\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " pod="openstack/crc-debug-vds2t" Jun 13 05:20:02 crc kubenswrapper[4894]: I0613 05:20:02.645799 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vds2t" Jun 13 05:20:03 crc kubenswrapper[4894]: I0613 05:20:03.303943 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vds2t" event={"ID":"449b0a9f-09ba-4c15-98fd-8430b1cc0d96","Type":"ContainerStarted","Data":"f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868"} Jun 13 05:20:03 crc kubenswrapper[4894]: I0613 05:20:03.304789 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vds2t" event={"ID":"449b0a9f-09ba-4c15-98fd-8430b1cc0d96","Type":"ContainerStarted","Data":"b8d04e7a0fd44bc653f670ff57358ef244df1ecb6d7faad7d68d85a4bbcf23d1"} Jun 13 05:20:03 crc kubenswrapper[4894]: I0613 05:20:03.325095 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-vds2t" podStartSLOduration=1.325067526 podStartE2EDuration="1.325067526s" podCreationTimestamp="2025-06-13 05:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:20:03.321288649 +0000 UTC m=+1761.767536142" watchObservedRunningTime="2025-06-13 05:20:03.325067526 +0000 UTC m=+1761.771315029" Jun 13 05:20:08 crc kubenswrapper[4894]: I0613 05:20:08.279469 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:20:08 crc kubenswrapper[4894]: E0613 05:20:08.280013 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.179643 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-vds2t"] Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.180867 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-vds2t" podUID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" containerName="container-00" containerID="cri-o://f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868" gracePeriod=2 Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.190169 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-vds2t"] Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.291409 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vds2t" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.401789 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host\") pod \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.401843 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kcf7\" (UniqueName: \"kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7\") pod \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\" (UID: \"449b0a9f-09ba-4c15-98fd-8430b1cc0d96\") " Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.401953 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host" (OuterVolumeSpecName: "host") pod "449b0a9f-09ba-4c15-98fd-8430b1cc0d96" (UID: "449b0a9f-09ba-4c15-98fd-8430b1cc0d96"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.402336 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.410051 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7" (OuterVolumeSpecName: "kube-api-access-2kcf7") pod "449b0a9f-09ba-4c15-98fd-8430b1cc0d96" (UID: "449b0a9f-09ba-4c15-98fd-8430b1cc0d96"). InnerVolumeSpecName "kube-api-access-2kcf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.448524 4894 generic.go:334] "Generic (PLEG): container finished" podID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" containerID="f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868" exitCode=0 Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.448603 4894 scope.go:117] "RemoveContainer" containerID="f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.449855 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vds2t" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.471432 4894 scope.go:117] "RemoveContainer" containerID="f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868" Jun 13 05:20:13 crc kubenswrapper[4894]: E0613 05:20:13.471972 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868\": container with ID starting with f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868 not found: ID does not exist" containerID="f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.472011 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868"} err="failed to get container status \"f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868\": rpc error: code = NotFound desc = could not find container \"f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868\": container with ID starting with f7739628e280122c65170a9a190bd2adb26a5f52e0982f1f3a07ac4ad5364868 not found: ID does not exist" Jun 13 05:20:13 crc kubenswrapper[4894]: I0613 05:20:13.504287 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kcf7\" (UniqueName: \"kubernetes.io/projected/449b0a9f-09ba-4c15-98fd-8430b1cc0d96-kube-api-access-2kcf7\") on node \"crc\" DevicePath \"\"" Jun 13 05:20:14 crc kubenswrapper[4894]: I0613 05:20:14.294468 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" path="/var/lib/kubelet/pods/449b0a9f-09ba-4c15-98fd-8430b1cc0d96/volumes" Jun 13 05:20:21 crc kubenswrapper[4894]: I0613 05:20:21.277139 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:20:21 crc kubenswrapper[4894]: E0613 05:20:21.277941 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:20:35 crc kubenswrapper[4894]: I0613 05:20:35.276794 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:20:35 crc kubenswrapper[4894]: I0613 05:20:35.856149 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329"} Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.561860 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-5sc4v"] Jun 13 05:21:01 crc kubenswrapper[4894]: E0613 05:21:01.562996 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" containerName="container-00" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.563020 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" containerName="container-00" 
Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.563314 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="449b0a9f-09ba-4c15-98fd-8430b1cc0d96" containerName="container-00" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.564387 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.568499 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.658971 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.659143 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzkq\" (UniqueName: \"kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.760511 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.760637 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzkq\" (UniqueName: \"kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.760727 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.796982 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzkq\" (UniqueName: \"kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq\") pod \"crc-debug-5sc4v\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " pod="openstack/crc-debug-5sc4v" Jun 13 05:21:01 crc kubenswrapper[4894]: I0613 05:21:01.894490 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5sc4v" Jun 13 05:21:02 crc kubenswrapper[4894]: I0613 05:21:02.156633 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5sc4v" event={"ID":"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82","Type":"ContainerStarted","Data":"1c76dbf5f238b743bb0c8d48df7cc30e6de89fd6db11084d9911157c293f88cf"} Jun 13 05:21:03 crc kubenswrapper[4894]: I0613 05:21:03.185711 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5sc4v" event={"ID":"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82","Type":"ContainerStarted","Data":"8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a"} Jun 13 05:21:03 crc kubenswrapper[4894]: I0613 05:21:03.214591 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-5sc4v" podStartSLOduration=2.214567763 podStartE2EDuration="2.214567763s" podCreationTimestamp="2025-06-13 05:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:21:03.208400058 +0000 UTC m=+1821.654647561" watchObservedRunningTime="2025-06-13 05:21:03.214567763 +0000 UTC m=+1821.660815256" Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.470054 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-5sc4v"] Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.470865 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-5sc4v" podUID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" containerName="container-00" containerID="cri-o://8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a" gracePeriod=2 Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.480643 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-5sc4v"] Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.558450 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-5sc4v" Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.637834 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host\") pod \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.637919 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzkq\" (UniqueName: \"kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq\") pod \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\" (UID: \"97fb4a24-b284-4a7c-98c4-9e55e8b3cd82\") " Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.637948 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host" (OuterVolumeSpecName: "host") pod "97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" (UID: "97fb4a24-b284-4a7c-98c4-9e55e8b3cd82"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.638171 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.644481 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq" (OuterVolumeSpecName: "kube-api-access-nnzkq") pod "97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" (UID: "97fb4a24-b284-4a7c-98c4-9e55e8b3cd82"). InnerVolumeSpecName "kube-api-access-nnzkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:21:12 crc kubenswrapper[4894]: I0613 05:21:12.740757 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnzkq\" (UniqueName: \"kubernetes.io/projected/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82-kube-api-access-nnzkq\") on node \"crc\" DevicePath \"\"" Jun 13 05:21:13 crc kubenswrapper[4894]: I0613 05:21:13.348984 4894 generic.go:334] "Generic (PLEG): container finished" podID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" containerID="8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a" exitCode=0 Jun 13 05:21:13 crc kubenswrapper[4894]: I0613 05:21:13.349068 4894 scope.go:117] "RemoveContainer" containerID="8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a" Jun 13 05:21:13 crc kubenswrapper[4894]: I0613 05:21:13.349254 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-5sc4v" Jun 13 05:21:13 crc kubenswrapper[4894]: I0613 05:21:13.391187 4894 scope.go:117] "RemoveContainer" containerID="8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a" Jun 13 05:21:13 crc kubenswrapper[4894]: E0613 05:21:13.391729 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a\": container with ID starting with 8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a not found: ID does not exist" containerID="8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a" Jun 13 05:21:13 crc kubenswrapper[4894]: I0613 05:21:13.391814 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a"} err="failed to get container status \"8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a\": rpc error: code = NotFound desc = could not find container \"8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a\": container with ID starting with 8135f40f4acd4d18bf451c4ab5bdebdb3121ea04aaa0d2597a1cb90869ebb35a not found: ID does not exist" Jun 13 05:21:14 crc kubenswrapper[4894]: I0613 05:21:14.297905 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" path="/var/lib/kubelet/pods/97fb4a24-b284-4a7c-98c4-9e55e8b3cd82/volumes" Jun 13 05:22:01 crc kubenswrapper[4894]: I0613 05:22:01.920242 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-flxvt"] Jun 13 05:22:01 crc kubenswrapper[4894]: E0613 05:22:01.921028 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" containerName="container-00" Jun 13 05:22:01 crc kubenswrapper[4894]: I0613 
05:22:01.921040 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" containerName="container-00" Jun 13 05:22:01 crc kubenswrapper[4894]: I0613 05:22:01.921208 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fb4a24-b284-4a7c-98c4-9e55e8b3cd82" containerName="container-00" Jun 13 05:22:01 crc kubenswrapper[4894]: I0613 05:22:01.921726 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-flxvt" Jun 13 05:22:01 crc kubenswrapper[4894]: I0613 05:22:01.926286 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.044954 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99btb\" (UniqueName: \"kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.045412 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.150051 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99btb\" (UniqueName: \"kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.150216 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.150483 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.182669 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99btb\" (UniqueName: \"kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb\") pod \"crc-debug-flxvt\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.239542 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-flxvt" Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.856165 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-flxvt" event={"ID":"06c305ec-1c52-4677-9be8-f7934922aad5","Type":"ContainerStarted","Data":"f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e"} Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.856548 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-flxvt" event={"ID":"06c305ec-1c52-4677-9be8-f7934922aad5","Type":"ContainerStarted","Data":"f93a29276b49d2dc99847661e835fcddb501e20a169ca025c22a508b2377da2e"} Jun 13 05:22:02 crc kubenswrapper[4894]: I0613 05:22:02.883899 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-flxvt" podStartSLOduration=1.883881459 podStartE2EDuration="1.883881459s" podCreationTimestamp="2025-06-13 05:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:22:02.879480984 +0000 UTC m=+1881.325728457" watchObservedRunningTime="2025-06-13 05:22:02.883881459 +0000 UTC m=+1881.330128932" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.758467 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-flxvt"] Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.759808 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-flxvt" podUID="06c305ec-1c52-4677-9be8-f7934922aad5" containerName="container-00" containerID="cri-o://f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e" gracePeriod=2 Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.765994 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-flxvt"] Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.824016 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-flxvt" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.911812 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99btb\" (UniqueName: \"kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb\") pod \"06c305ec-1c52-4677-9be8-f7934922aad5\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.912200 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host\") pod \"06c305ec-1c52-4677-9be8-f7934922aad5\" (UID: \"06c305ec-1c52-4677-9be8-f7934922aad5\") " Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.912258 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host" (OuterVolumeSpecName: "host") pod "06c305ec-1c52-4677-9be8-f7934922aad5" (UID: "06c305ec-1c52-4677-9be8-f7934922aad5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.912787 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06c305ec-1c52-4677-9be8-f7934922aad5-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.923461 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb" (OuterVolumeSpecName: "kube-api-access-99btb") pod "06c305ec-1c52-4677-9be8-f7934922aad5" (UID: "06c305ec-1c52-4677-9be8-f7934922aad5"). InnerVolumeSpecName "kube-api-access-99btb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.985825 4894 generic.go:334] "Generic (PLEG): container finished" podID="06c305ec-1c52-4677-9be8-f7934922aad5" containerID="f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e" exitCode=0 Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.985876 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-flxvt" Jun 13 05:22:12 crc kubenswrapper[4894]: I0613 05:22:12.985891 4894 scope.go:117] "RemoveContainer" containerID="f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e" Jun 13 05:22:13 crc kubenswrapper[4894]: I0613 05:22:13.011679 4894 scope.go:117] "RemoveContainer" containerID="f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e" Jun 13 05:22:13 crc kubenswrapper[4894]: I0613 05:22:13.015028 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99btb\" (UniqueName: \"kubernetes.io/projected/06c305ec-1c52-4677-9be8-f7934922aad5-kube-api-access-99btb\") on node \"crc\" DevicePath \"\"" Jun 13 05:22:13 crc kubenswrapper[4894]: E0613 05:22:13.020473 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e\": container with ID starting with f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e not found: ID does not exist" containerID="f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e" Jun 13 05:22:13 crc kubenswrapper[4894]: I0613 05:22:13.020537 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e"} err="failed to get container status \"f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e\": rpc error: code = NotFound desc = could not find container \"f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e\": container with ID starting with f87313fbff8c807a7f5bdd338e764715a9779fa7424e4dbd4bd8f8cbcec8765e not found: ID does not exist" Jun 13 05:22:14 crc kubenswrapper[4894]: I0613 05:22:14.294844 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c305ec-1c52-4677-9be8-f7934922aad5" path="/var/lib/kubelet/pods/06c305ec-1c52-4677-9be8-f7934922aad5/volumes" Jun 13 05:22:56 crc kubenswrapper[4894]: I0613 05:22:56.236079 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:22:56 crc kubenswrapper[4894]: I0613 
05:22:56.236864 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.210334 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-24g2r"] Jun 13 05:23:02 crc kubenswrapper[4894]: E0613 05:23:02.211725 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c305ec-1c52-4677-9be8-f7934922aad5" containerName="container-00" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.211750 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c305ec-1c52-4677-9be8-f7934922aad5" containerName="container-00" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.212091 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c305ec-1c52-4677-9be8-f7934922aad5" containerName="container-00" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.213104 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.215764 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.336804 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host\") pod \"crc-debug-24g2r\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.336971 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqr99\" (UniqueName: \"kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99\") pod \"crc-debug-24g2r\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.439127 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqr99\" (UniqueName: \"kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99\") pod \"crc-debug-24g2r\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.439399 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host\") pod \"crc-debug-24g2r\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.439759 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host\") pod \"crc-debug-24g2r\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.473849 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqr99\" (UniqueName: \"kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99\") pod \"crc-debug-24g2r\" (UID: 
\"5754e49b-5d40-43ff-bb03-f52d55296c04\") " pod="openstack/crc-debug-24g2r" Jun 13 05:23:02 crc kubenswrapper[4894]: I0613 05:23:02.543641 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-24g2r" Jun 13 05:23:03 crc kubenswrapper[4894]: I0613 05:23:03.518695 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-24g2r" event={"ID":"5754e49b-5d40-43ff-bb03-f52d55296c04","Type":"ContainerStarted","Data":"71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4"} Jun 13 05:23:03 crc kubenswrapper[4894]: I0613 05:23:03.518982 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-24g2r" event={"ID":"5754e49b-5d40-43ff-bb03-f52d55296c04","Type":"ContainerStarted","Data":"f9bd9a02b61101b181aab7fd80d44fc8cb83d62d272fd9e22b745e2122f26d44"} Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.179985 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-24g2r" podStartSLOduration=11.179958905 podStartE2EDuration="11.179958905s" podCreationTimestamp="2025-06-13 05:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:23:03.53714875 +0000 UTC m=+1941.983396223" watchObservedRunningTime="2025-06-13 05:23:13.179958905 +0000 UTC m=+1951.626206408" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.182394 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-24g2r"] Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.182758 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-24g2r" podUID="5754e49b-5d40-43ff-bb03-f52d55296c04" containerName="container-00" containerID="cri-o://71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4" gracePeriod=2 Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.190412 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-24g2r"] Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.284931 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-24g2r" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.401605 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host\") pod \"5754e49b-5d40-43ff-bb03-f52d55296c04\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.401826 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqr99\" (UniqueName: \"kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99\") pod \"5754e49b-5d40-43ff-bb03-f52d55296c04\" (UID: \"5754e49b-5d40-43ff-bb03-f52d55296c04\") " Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.402093 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host" (OuterVolumeSpecName: "host") pod "5754e49b-5d40-43ff-bb03-f52d55296c04" (UID: "5754e49b-5d40-43ff-bb03-f52d55296c04"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.402306 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5754e49b-5d40-43ff-bb03-f52d55296c04-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.407830 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99" (OuterVolumeSpecName: "kube-api-access-qqr99") pod "5754e49b-5d40-43ff-bb03-f52d55296c04" (UID: "5754e49b-5d40-43ff-bb03-f52d55296c04"). InnerVolumeSpecName "kube-api-access-qqr99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.503020 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqr99\" (UniqueName: \"kubernetes.io/projected/5754e49b-5d40-43ff-bb03-f52d55296c04-kube-api-access-qqr99\") on node \"crc\" DevicePath \"\"" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.635869 4894 generic.go:334] "Generic (PLEG): container finished" podID="5754e49b-5d40-43ff-bb03-f52d55296c04" containerID="71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4" exitCode=0 Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.635924 4894 scope.go:117] "RemoveContainer" containerID="71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.635928 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-24g2r" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.655235 4894 scope.go:117] "RemoveContainer" containerID="71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4" Jun 13 05:23:13 crc kubenswrapper[4894]: E0613 05:23:13.655800 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4\": container with ID starting with 71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4 not found: ID does not exist" containerID="71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4" Jun 13 05:23:13 crc kubenswrapper[4894]: I0613 05:23:13.655835 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4"} err="failed to get container status \"71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4\": rpc error: code = NotFound desc = could not find container \"71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4\": container with ID starting with 71a1cd828c17d1101311ae2d178bbbb2b6692e4c464554e1bdd1b8c3193caca4 not found: ID does not exist" Jun 13 05:23:14 crc kubenswrapper[4894]: I0613 05:23:14.307467 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5754e49b-5d40-43ff-bb03-f52d55296c04" path="/var/lib/kubelet/pods/5754e49b-5d40-43ff-bb03-f52d55296c04/volumes" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.548454 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:20 crc kubenswrapper[4894]: E0613 05:23:20.549595 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5754e49b-5d40-43ff-bb03-f52d55296c04" containerName="container-00" Jun 13 05:23:20 crc 
kubenswrapper[4894]: I0613 05:23:20.549617 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5754e49b-5d40-43ff-bb03-f52d55296c04" containerName="container-00" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.549995 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5754e49b-5d40-43ff-bb03-f52d55296c04" containerName="container-00" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.552195 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.563400 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.655686 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.655746 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.655843 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mrv\" (UniqueName: \"kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.757092 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.757146 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.757234 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56mrv\" (UniqueName: \"kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.757745 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 
05:23:20.757956 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.783761 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56mrv\" (UniqueName: \"kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv\") pod \"redhat-operators-6hgch\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:20 crc kubenswrapper[4894]: I0613 05:23:20.907396 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:21 crc kubenswrapper[4894]: I0613 05:23:21.394523 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:21 crc kubenswrapper[4894]: I0613 05:23:21.729675 4894 generic.go:334] "Generic (PLEG): container finished" podID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerID="f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1" exitCode=0 Jun 13 05:23:21 crc kubenswrapper[4894]: I0613 05:23:21.729754 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerDied","Data":"f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1"} Jun 13 05:23:21 crc kubenswrapper[4894]: I0613 05:23:21.729966 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerStarted","Data":"36aabb59028ac03c49c69591b844a43efe026f7ac4cbdb0ca6c1bbc1d3c22811"} Jun 13 05:23:21 crc kubenswrapper[4894]: I0613 05:23:21.732915 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:23:22 crc kubenswrapper[4894]: I0613 05:23:22.740147 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerStarted","Data":"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41"} Jun 13 05:23:23 crc kubenswrapper[4894]: I0613 05:23:23.753256 4894 generic.go:334] "Generic (PLEG): container finished" podID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerID="89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41" exitCode=0 Jun 13 05:23:23 crc kubenswrapper[4894]: I0613 05:23:23.753413 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerDied","Data":"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41"} Jun 13 05:23:24 crc kubenswrapper[4894]: I0613 05:23:24.773226 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerStarted","Data":"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b"} Jun 13 05:23:24 crc kubenswrapper[4894]: I0613 05:23:24.801212 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6hgch" 
podStartSLOduration=2.308926023 podStartE2EDuration="4.801188299s" podCreationTimestamp="2025-06-13 05:23:20 +0000 UTC" firstStartedPulling="2025-06-13 05:23:21.732522066 +0000 UTC m=+1960.178769559" lastFinishedPulling="2025-06-13 05:23:24.224784372 +0000 UTC m=+1962.671031835" observedRunningTime="2025-06-13 05:23:24.798545894 +0000 UTC m=+1963.244793367" watchObservedRunningTime="2025-06-13 05:23:24.801188299 +0000 UTC m=+1963.247435802" Jun 13 05:23:26 crc kubenswrapper[4894]: I0613 05:23:26.237287 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:23:26 crc kubenswrapper[4894]: I0613 05:23:26.237636 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:23:30 crc kubenswrapper[4894]: I0613 05:23:30.908028 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:30 crc kubenswrapper[4894]: I0613 05:23:30.909025 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:31 crc kubenswrapper[4894]: I0613 05:23:31.018569 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:31 crc kubenswrapper[4894]: I0613 05:23:31.930623 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:31 crc kubenswrapper[4894]: I0613 05:23:31.996410 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:33 crc kubenswrapper[4894]: I0613 05:23:33.863594 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6hgch" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="registry-server" containerID="cri-o://446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b" gracePeriod=2 Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.393386 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.469633 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56mrv\" (UniqueName: \"kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv\") pod \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.469726 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities\") pod \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.469785 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content\") pod \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\" (UID: \"0caf99eb-c242-4bef-8e1f-70a4ffb388c6\") " Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.471070 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities" (OuterVolumeSpecName: "utilities") pod "0caf99eb-c242-4bef-8e1f-70a4ffb388c6" (UID: "0caf99eb-c242-4bef-8e1f-70a4ffb388c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.475732 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv" (OuterVolumeSpecName: "kube-api-access-56mrv") pod "0caf99eb-c242-4bef-8e1f-70a4ffb388c6" (UID: "0caf99eb-c242-4bef-8e1f-70a4ffb388c6"). InnerVolumeSpecName "kube-api-access-56mrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.541634 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0caf99eb-c242-4bef-8e1f-70a4ffb388c6" (UID: "0caf99eb-c242-4bef-8e1f-70a4ffb388c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.571364 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56mrv\" (UniqueName: \"kubernetes.io/projected/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-kube-api-access-56mrv\") on node \"crc\" DevicePath \"\"" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.571400 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.571413 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caf99eb-c242-4bef-8e1f-70a4ffb388c6-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.878265 4894 generic.go:334] "Generic (PLEG): container finished" podID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerID="446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b" exitCode=0 Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.878345 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerDied","Data":"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b"} Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.878366 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hgch" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.878397 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hgch" event={"ID":"0caf99eb-c242-4bef-8e1f-70a4ffb388c6","Type":"ContainerDied","Data":"36aabb59028ac03c49c69591b844a43efe026f7ac4cbdb0ca6c1bbc1d3c22811"} Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.878432 4894 scope.go:117] "RemoveContainer" containerID="446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.923017 4894 scope.go:117] "RemoveContainer" containerID="89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41" Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.936298 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.951144 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6hgch"] Jun 13 05:23:34 crc kubenswrapper[4894]: I0613 05:23:34.957029 4894 scope.go:117] "RemoveContainer" containerID="f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.019744 4894 scope.go:117] "RemoveContainer" containerID="446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b" Jun 13 05:23:35 crc kubenswrapper[4894]: E0613 05:23:35.025147 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b\": container with ID starting with 446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b not found: ID does not exist" containerID="446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.025233 4894 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b"} err="failed to get container status \"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b\": rpc error: code = NotFound desc = could not find container \"446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b\": container with ID starting with 446f42e3296e7589dae0cd089cd3992a66248a10f7cc2883f0ac7036efbab66b not found: ID does not exist" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.025289 4894 scope.go:117] "RemoveContainer" containerID="89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41" Jun 13 05:23:35 crc kubenswrapper[4894]: E0613 05:23:35.026293 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41\": container with ID starting with 89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41 not found: ID does not exist" containerID="89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.026565 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41"} err="failed to get container status \"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41\": rpc error: code = NotFound desc = could not find container \"89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41\": container with ID starting with 89bb11f6e295ca7164e9fb4363cf6071e269fc398f736a6e909be510d08aad41 not found: ID does not exist" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.026586 4894 scope.go:117] "RemoveContainer" containerID="f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1" Jun 13 05:23:35 crc kubenswrapper[4894]: E0613 05:23:35.027006 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1\": container with ID starting with f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1 not found: ID does not exist" containerID="f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1" Jun 13 05:23:35 crc kubenswrapper[4894]: I0613 05:23:35.027068 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1"} err="failed to get container status \"f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1\": rpc error: code = NotFound desc = could not find container \"f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1\": container with ID starting with f2705534ea581dacc7a6daa20895d6cc655757a008816b6a64135cb3f5b92de1 not found: ID does not exist" Jun 13 05:23:36 crc kubenswrapper[4894]: I0613 05:23:36.294726 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" path="/var/lib/kubelet/pods/0caf99eb-c242-4bef-8e1f-70a4ffb388c6/volumes" Jun 13 05:23:56 crc kubenswrapper[4894]: I0613 05:23:56.237012 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:23:56 crc kubenswrapper[4894]: I0613 05:23:56.237568 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:23:56 crc kubenswrapper[4894]: I0613 05:23:56.237618 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:23:56 crc kubenswrapper[4894]: I0613 05:23:56.238346 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:23:56 crc kubenswrapper[4894]: I0613 05:23:56.238410 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329" gracePeriod=600 Jun 13 05:23:57 crc kubenswrapper[4894]: I0613 05:23:57.125162 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329" exitCode=0 Jun 13 05:23:57 crc kubenswrapper[4894]: I0613 05:23:57.125253 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329"} Jun 13 05:23:57 crc kubenswrapper[4894]: I0613 05:23:57.125459 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a"} Jun 13 05:23:57 crc kubenswrapper[4894]: I0613 05:23:57.125485 4894 scope.go:117] "RemoveContainer" containerID="bda13e050bad96eea6a42fe6cf1d47e6ae084d39fb23dd1d76bfa41ba0893e4b" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.531641 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-tz89c"] Jun 13 05:24:01 crc kubenswrapper[4894]: E0613 05:24:01.533127 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="extract-utilities" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.533163 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="extract-utilities" Jun 13 05:24:01 crc kubenswrapper[4894]: E0613 05:24:01.533193 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="extract-content" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.533211 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="extract-content" Jun 13 
05:24:01 crc kubenswrapper[4894]: E0613 05:24:01.533276 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="registry-server" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.533296 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="registry-server" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.533760 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0caf99eb-c242-4bef-8e1f-70a4ffb388c6" containerName="registry-server" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.534920 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.537699 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.583894 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8cb\" (UniqueName: \"kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.584030 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.685135 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.685404 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8cb\" (UniqueName: \"kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.685580 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.718989 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8cb\" (UniqueName: \"kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb\") pod \"crc-debug-tz89c\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: I0613 05:24:01.868108 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-tz89c" Jun 13 05:24:01 crc kubenswrapper[4894]: W0613 05:24:01.910462 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8f25fd2_5f41_4d6b_9a91_dc3b972b1d38.slice/crio-6fb5a701a075c451a46f2a33783e7a673d9ad380f5f819d84539437e9a9a3a8a WatchSource:0}: Error finding container 6fb5a701a075c451a46f2a33783e7a673d9ad380f5f819d84539437e9a9a3a8a: Status 404 returned error can't find the container with id 6fb5a701a075c451a46f2a33783e7a673d9ad380f5f819d84539437e9a9a3a8a Jun 13 05:24:02 crc kubenswrapper[4894]: I0613 05:24:02.182283 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-tz89c" event={"ID":"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38","Type":"ContainerStarted","Data":"11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96"} Jun 13 05:24:02 crc kubenswrapper[4894]: I0613 05:24:02.182521 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-tz89c" event={"ID":"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38","Type":"ContainerStarted","Data":"6fb5a701a075c451a46f2a33783e7a673d9ad380f5f819d84539437e9a9a3a8a"} Jun 13 05:24:02 crc kubenswrapper[4894]: I0613 05:24:02.203225 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-tz89c" podStartSLOduration=1.203206002 podStartE2EDuration="1.203206002s" podCreationTimestamp="2025-06-13 05:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:24:02.19959461 +0000 UTC m=+2000.645842073" watchObservedRunningTime="2025-06-13 05:24:02.203206002 +0000 UTC m=+2000.649453455" Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.477209 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-tz89c"] Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.479315 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-tz89c" podUID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" containerName="container-00" containerID="cri-o://11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96" gracePeriod=2 Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.493526 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-tz89c"] Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.572697 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-tz89c" Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.740426 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host\") pod \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.740649 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host" (OuterVolumeSpecName: "host") pod "a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" (UID: "a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.741111 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8cb\" (UniqueName: \"kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb\") pod \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\" (UID: \"a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38\") " Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.742160 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.750405 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb" (OuterVolumeSpecName: "kube-api-access-vx8cb") pod "a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" (UID: "a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38"). InnerVolumeSpecName "kube-api-access-vx8cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:24:12 crc kubenswrapper[4894]: I0613 05:24:12.844111 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8cb\" (UniqueName: \"kubernetes.io/projected/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38-kube-api-access-vx8cb\") on node \"crc\" DevicePath \"\"" Jun 13 05:24:13 crc kubenswrapper[4894]: I0613 05:24:13.283481 4894 generic.go:334] "Generic (PLEG): container finished" podID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" containerID="11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96" exitCode=0 Jun 13 05:24:13 crc kubenswrapper[4894]: I0613 05:24:13.283577 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-tz89c" Jun 13 05:24:13 crc kubenswrapper[4894]: I0613 05:24:13.283633 4894 scope.go:117] "RemoveContainer" containerID="11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96" Jun 13 05:24:13 crc kubenswrapper[4894]: I0613 05:24:13.324819 4894 scope.go:117] "RemoveContainer" containerID="11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96" Jun 13 05:24:13 crc kubenswrapper[4894]: E0613 05:24:13.325725 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96\": container with ID starting with 11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96 not found: ID does not exist" containerID="11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96" Jun 13 05:24:13 crc kubenswrapper[4894]: I0613 05:24:13.325792 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96"} err="failed to get container status \"11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96\": rpc error: code = NotFound desc = could not find container \"11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96\": container with ID starting with 11d9c97b4c9f7d13582c3cd3e2935593648b1a26e80d0a4e0787b20c04581f96 not found: ID does not exist" Jun 13 05:24:14 crc kubenswrapper[4894]: I0613 05:24:14.294523 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" path="/var/lib/kubelet/pods/a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38/volumes" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.892339 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-lsp26"] Jun 13 05:25:01 crc kubenswrapper[4894]: E0613 05:25:01.893305 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" containerName="container-00" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.893321 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" containerName="container-00" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.893530 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f25fd2-5f41-4d6b-9a91-dc3b972b1d38" containerName="container-00" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.894242 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-lsp26" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.902530 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.960959 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kjtj\" (UniqueName: \"kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:01 crc kubenswrapper[4894]: I0613 05:25:01.961331 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.063152 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kjtj\" (UniqueName: \"kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.063349 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.063522 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.109263 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kjtj\" (UniqueName: \"kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj\") pod \"crc-debug-lsp26\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.218515 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-lsp26" Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.770935 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-lsp26" event={"ID":"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9","Type":"ContainerStarted","Data":"e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa"} Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.770982 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-lsp26" event={"ID":"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9","Type":"ContainerStarted","Data":"ec1720fe2e1b0107c63474b984bd19e57e5733da1b0d12f42d91ebd6a702f45d"} Jun 13 05:25:02 crc kubenswrapper[4894]: I0613 05:25:02.786110 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-lsp26" podStartSLOduration=1.786099533 podStartE2EDuration="1.786099533s" podCreationTimestamp="2025-06-13 05:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:25:02.785028792 +0000 UTC m=+2061.231276255" watchObservedRunningTime="2025-06-13 05:25:02.786099533 +0000 UTC m=+2061.232346996" Jun 13 05:25:04 crc kubenswrapper[4894]: E0613 05:25:04.965954 4894 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:55102->38.102.83.213:40951: write tcp 38.102.83.213:55102->38.102.83.213:40951: write: broken pipe Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.003337 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-lsp26"] Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.004262 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-lsp26" podUID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" containerName="container-00" containerID="cri-o://e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa" gracePeriod=2 Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.019378 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-lsp26"] Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.091859 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-lsp26" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.193133 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kjtj\" (UniqueName: \"kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj\") pod \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.193195 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host\") pod \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\" (UID: \"5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9\") " Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.193461 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host" (OuterVolumeSpecName: "host") pod "5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" (UID: "5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.194056 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.208474 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj" (OuterVolumeSpecName: "kube-api-access-7kjtj") pod "5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" (UID: "5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9"). InnerVolumeSpecName "kube-api-access-7kjtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.296109 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kjtj\" (UniqueName: \"kubernetes.io/projected/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9-kube-api-access-7kjtj\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.906008 4894 generic.go:334] "Generic (PLEG): container finished" podID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" containerID="e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa" exitCode=0 Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.906053 4894 scope.go:117] "RemoveContainer" containerID="e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.906098 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-lsp26" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.925153 4894 scope.go:117] "RemoveContainer" containerID="e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa" Jun 13 05:25:13 crc kubenswrapper[4894]: E0613 05:25:13.925638 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa\": container with ID starting with e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa not found: ID does not exist" containerID="e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa" Jun 13 05:25:13 crc kubenswrapper[4894]: I0613 05:25:13.925709 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa"} err="failed to get container status \"e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa\": rpc error: code = NotFound desc = could not find container \"e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa\": container with ID starting with e30e859d6ee3737be75a0b7184cd15c7c65c603ef0c1f5adc3304daebbe2e1fa not found: ID does not exist" Jun 13 05:25:14 crc kubenswrapper[4894]: I0613 05:25:14.290975 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" path="/var/lib/kubelet/pods/5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9/volumes" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.092418 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:21 crc kubenswrapper[4894]: E0613 05:25:21.093338 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" containerName="container-00" Jun 13 05:25:21 crc 
kubenswrapper[4894]: I0613 05:25:21.093355 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" containerName="container-00" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.093590 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9ae93a-4ad5-4a5e-b221-4fb33bfc23d9" containerName="container-00" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.095077 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.113609 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.156080 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.156162 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hlvs\" (UniqueName: \"kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.156184 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.257601 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.257678 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hlvs\" (UniqueName: \"kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.257699 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.258132 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 
13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.258379 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.297362 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hlvs\" (UniqueName: \"kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs\") pod \"certified-operators-ttst7\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.422154 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.667903 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-458gw"] Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.670167 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.678908 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-458gw"] Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.788768 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzxh\" (UniqueName: \"kubernetes.io/projected/ede0a450-b219-4eeb-a434-bf3ee2f699de-kube-api-access-pgzxh\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.788879 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-catalog-content\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.788898 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-utilities\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.890249 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzxh\" (UniqueName: \"kubernetes.io/projected/ede0a450-b219-4eeb-a434-bf3ee2f699de-kube-api-access-pgzxh\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.890359 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-utilities\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " 
pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.890383 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-catalog-content\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.890853 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-utilities\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.890869 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ede0a450-b219-4eeb-a434-bf3ee2f699de-catalog-content\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.922059 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzxh\" (UniqueName: \"kubernetes.io/projected/ede0a450-b219-4eeb-a434-bf3ee2f699de-kube-api-access-pgzxh\") pod \"community-operators-458gw\" (UID: \"ede0a450-b219-4eeb-a434-bf3ee2f699de\") " pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.959958 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.988214 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerStarted","Data":"95fded439e3ea4e05f30e7b74f81b54891c0cc909cc8a807760cc46cb8a4893f"} Jun 13 05:25:21 crc kubenswrapper[4894]: I0613 05:25:21.989701 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:22 crc kubenswrapper[4894]: I0613 05:25:22.576665 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-458gw"] Jun 13 05:25:23 crc kubenswrapper[4894]: I0613 05:25:23.002064 4894 generic.go:334] "Generic (PLEG): container finished" podID="ede0a450-b219-4eeb-a434-bf3ee2f699de" containerID="e8310b51f378ac91166e01803511cdb5897fb76fbdb5b1076515246a05382576" exitCode=0 Jun 13 05:25:23 crc kubenswrapper[4894]: I0613 05:25:23.003766 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-458gw" event={"ID":"ede0a450-b219-4eeb-a434-bf3ee2f699de","Type":"ContainerDied","Data":"e8310b51f378ac91166e01803511cdb5897fb76fbdb5b1076515246a05382576"} Jun 13 05:25:23 crc kubenswrapper[4894]: I0613 05:25:23.003816 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-458gw" event={"ID":"ede0a450-b219-4eeb-a434-bf3ee2f699de","Type":"ContainerStarted","Data":"4317d72d3c0861cd6403908ff48d6183869fc2613ac1231ba3ed74bd72536380"} Jun 13 05:25:23 crc kubenswrapper[4894]: I0613 05:25:23.006848 4894 generic.go:334] "Generic (PLEG): container finished" podID="9677ec88-902d-4746-b470-36d04f74e568" containerID="1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058" exitCode=0 Jun 13 05:25:23 crc kubenswrapper[4894]: I0613 05:25:23.006885 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerDied","Data":"1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058"} Jun 13 05:25:24 crc kubenswrapper[4894]: I0613 05:25:24.025877 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerStarted","Data":"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65"} Jun 13 05:25:25 crc kubenswrapper[4894]: I0613 05:25:25.037999 4894 generic.go:334] "Generic (PLEG): container finished" podID="9677ec88-902d-4746-b470-36d04f74e568" containerID="91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65" exitCode=0 Jun 13 05:25:25 crc kubenswrapper[4894]: I0613 05:25:25.038130 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerDied","Data":"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65"} Jun 13 05:25:25 crc kubenswrapper[4894]: I0613 05:25:25.887628 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:25 crc kubenswrapper[4894]: I0613 05:25:25.891347 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:25 crc kubenswrapper[4894]: I0613 05:25:25.897541 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:25.976197 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:25.976256 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5llb9\" (UniqueName: \"kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:25.976326 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.077537 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.077600 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5llb9\" (UniqueName: \"kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.077681 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.078128 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.078344 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.094113 4894 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5llb9\" (UniqueName: \"kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9\") pod \"redhat-marketplace-689g6\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.217504 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:26 crc kubenswrapper[4894]: I0613 05:25:26.680679 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:26 crc kubenswrapper[4894]: W0613 05:25:26.724316 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677fd655_48fb_45cb_8063_ed07a20429fd.slice/crio-1212ca05eba3f3df51ab552e2d239a9060b805abaef41187b27d6add36df8fda WatchSource:0}: Error finding container 1212ca05eba3f3df51ab552e2d239a9060b805abaef41187b27d6add36df8fda: Status 404 returned error can't find the container with id 1212ca05eba3f3df51ab552e2d239a9060b805abaef41187b27d6add36df8fda Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.065550 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerStarted","Data":"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38"} Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.065617 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerStarted","Data":"1212ca05eba3f3df51ab552e2d239a9060b805abaef41187b27d6add36df8fda"} Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.071281 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerStarted","Data":"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1"} Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.075122 4894 generic.go:334] "Generic (PLEG): container finished" podID="ede0a450-b219-4eeb-a434-bf3ee2f699de" containerID="01e27c87512c8b26f8cca4783fe91d479a58dc51a78fbbfe46408124278c8f25" exitCode=0 Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.075170 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-458gw" event={"ID":"ede0a450-b219-4eeb-a434-bf3ee2f699de","Type":"ContainerDied","Data":"01e27c87512c8b26f8cca4783fe91d479a58dc51a78fbbfe46408124278c8f25"} Jun 13 05:25:27 crc kubenswrapper[4894]: I0613 05:25:27.141202 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ttst7" podStartSLOduration=2.582604982 podStartE2EDuration="6.141182632s" podCreationTimestamp="2025-06-13 05:25:21 +0000 UTC" firstStartedPulling="2025-06-13 05:25:23.009053747 +0000 UTC m=+2081.455301250" lastFinishedPulling="2025-06-13 05:25:26.567631437 +0000 UTC m=+2085.013878900" observedRunningTime="2025-06-13 05:25:27.136433437 +0000 UTC m=+2085.582680910" watchObservedRunningTime="2025-06-13 05:25:27.141182632 +0000 UTC m=+2085.587430105" Jun 13 05:25:28 crc kubenswrapper[4894]: I0613 05:25:28.086601 4894 generic.go:334] "Generic (PLEG): container finished" podID="677fd655-48fb-45cb-8063-ed07a20429fd" 
containerID="1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38" exitCode=0 Jun 13 05:25:28 crc kubenswrapper[4894]: I0613 05:25:28.086703 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerDied","Data":"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38"} Jun 13 05:25:29 crc kubenswrapper[4894]: I0613 05:25:29.098511 4894 generic.go:334] "Generic (PLEG): container finished" podID="677fd655-48fb-45cb-8063-ed07a20429fd" containerID="41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c" exitCode=0 Jun 13 05:25:29 crc kubenswrapper[4894]: I0613 05:25:29.098565 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerDied","Data":"41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c"} Jun 13 05:25:29 crc kubenswrapper[4894]: I0613 05:25:29.104095 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-458gw" event={"ID":"ede0a450-b219-4eeb-a434-bf3ee2f699de","Type":"ContainerStarted","Data":"64667ce31cebd8bb0ffe86c7a4e051ec96dc83462b16b1a06e55762942262d37"} Jun 13 05:25:29 crc kubenswrapper[4894]: I0613 05:25:29.155868 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-458gw" podStartSLOduration=3.012514309 podStartE2EDuration="8.155847724s" podCreationTimestamp="2025-06-13 05:25:21 +0000 UTC" firstStartedPulling="2025-06-13 05:25:23.006160125 +0000 UTC m=+2081.452407628" lastFinishedPulling="2025-06-13 05:25:28.14949357 +0000 UTC m=+2086.595741043" observedRunningTime="2025-06-13 05:25:29.14901264 +0000 UTC m=+2087.595260113" watchObservedRunningTime="2025-06-13 05:25:29.155847724 +0000 UTC m=+2087.602095197" Jun 13 05:25:30 crc kubenswrapper[4894]: I0613 05:25:30.120637 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerStarted","Data":"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84"} Jun 13 05:25:30 crc kubenswrapper[4894]: I0613 05:25:30.147568 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-689g6" podStartSLOduration=3.564219626 podStartE2EDuration="5.147552181s" podCreationTimestamp="2025-06-13 05:25:25 +0000 UTC" firstStartedPulling="2025-06-13 05:25:28.088618346 +0000 UTC m=+2086.534865819" lastFinishedPulling="2025-06-13 05:25:29.671950871 +0000 UTC m=+2088.118198374" observedRunningTime="2025-06-13 05:25:30.142044305 +0000 UTC m=+2088.588291778" watchObservedRunningTime="2025-06-13 05:25:30.147552181 +0000 UTC m=+2088.593799644" Jun 13 05:25:31 crc kubenswrapper[4894]: I0613 05:25:31.422800 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:31 crc kubenswrapper[4894]: I0613 05:25:31.423363 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:31 crc kubenswrapper[4894]: I0613 05:25:31.506025 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:31 crc kubenswrapper[4894]: I0613 05:25:31.990682 4894 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:31 crc kubenswrapper[4894]: I0613 05:25:31.990772 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:32 crc kubenswrapper[4894]: I0613 05:25:32.072726 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:32 crc kubenswrapper[4894]: I0613 05:25:32.234199 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:34 crc kubenswrapper[4894]: I0613 05:25:34.656922 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:34 crc kubenswrapper[4894]: I0613 05:25:34.657419 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ttst7" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="registry-server" containerID="cri-o://cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1" gracePeriod=2 Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.150309 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.176509 4894 generic.go:334] "Generic (PLEG): container finished" podID="9677ec88-902d-4746-b470-36d04f74e568" containerID="cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1" exitCode=0 Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.176557 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerDied","Data":"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1"} Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.176584 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ttst7" event={"ID":"9677ec88-902d-4746-b470-36d04f74e568","Type":"ContainerDied","Data":"95fded439e3ea4e05f30e7b74f81b54891c0cc909cc8a807760cc46cb8a4893f"} Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.176607 4894 scope.go:117] "RemoveContainer" containerID="cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.176747 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ttst7" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.197397 4894 scope.go:117] "RemoveContainer" containerID="91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.222611 4894 scope.go:117] "RemoveContainer" containerID="1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.256570 4894 scope.go:117] "RemoveContainer" containerID="cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1" Jun 13 05:25:35 crc kubenswrapper[4894]: E0613 05:25:35.257215 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1\": container with ID starting with cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1 not found: ID does not exist" containerID="cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.257291 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1"} err="failed to get container status \"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1\": rpc error: code = NotFound desc = could not find container \"cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1\": container with ID starting with cf630c8a1e6834eda396ab9550806f50341d5d248cba1f3e16ba281e1a05b9f1 not found: ID does not exist" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.257345 4894 scope.go:117] "RemoveContainer" containerID="91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65" Jun 13 05:25:35 crc kubenswrapper[4894]: E0613 05:25:35.257763 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65\": container with ID starting with 91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65 not found: ID does not exist" containerID="91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.257803 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65"} err="failed to get container status \"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65\": rpc error: code = NotFound desc = could not find container \"91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65\": container with ID starting with 91a7785c8119b40b0b1c0406aa3d05769b74c7a1c4ffa2739ad43d486be6df65 not found: ID does not exist" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.257830 4894 scope.go:117] "RemoveContainer" containerID="1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058" Jun 13 05:25:35 crc kubenswrapper[4894]: E0613 05:25:35.258142 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058\": container with ID starting with 1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058 not found: ID does not exist" containerID="1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058" 
Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.258184 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058"} err="failed to get container status \"1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058\": rpc error: code = NotFound desc = could not find container \"1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058\": container with ID starting with 1fa65a55babc82a8691803427129e1872ac8952913a6a2f9280d9e0b25e21058 not found: ID does not exist" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.345526 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content\") pod \"9677ec88-902d-4746-b470-36d04f74e568\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.345708 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hlvs\" (UniqueName: \"kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs\") pod \"9677ec88-902d-4746-b470-36d04f74e568\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.345844 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities\") pod \"9677ec88-902d-4746-b470-36d04f74e568\" (UID: \"9677ec88-902d-4746-b470-36d04f74e568\") " Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.346729 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities" (OuterVolumeSpecName: "utilities") pod "9677ec88-902d-4746-b470-36d04f74e568" (UID: "9677ec88-902d-4746-b470-36d04f74e568"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.351814 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs" (OuterVolumeSpecName: "kube-api-access-9hlvs") pod "9677ec88-902d-4746-b470-36d04f74e568" (UID: "9677ec88-902d-4746-b470-36d04f74e568"). InnerVolumeSpecName "kube-api-access-9hlvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.395183 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9677ec88-902d-4746-b470-36d04f74e568" (UID: "9677ec88-902d-4746-b470-36d04f74e568"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.447711 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.447738 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9677ec88-902d-4746-b470-36d04f74e568-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.447753 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hlvs\" (UniqueName: \"kubernetes.io/projected/9677ec88-902d-4746-b470-36d04f74e568-kube-api-access-9hlvs\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.515791 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:35 crc kubenswrapper[4894]: I0613 05:25:35.522452 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ttst7"] Jun 13 05:25:36 crc kubenswrapper[4894]: I0613 05:25:36.218850 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:36 crc kubenswrapper[4894]: I0613 05:25:36.219139 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:36 crc kubenswrapper[4894]: I0613 05:25:36.295813 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9677ec88-902d-4746-b470-36d04f74e568" path="/var/lib/kubelet/pods/9677ec88-902d-4746-b470-36d04f74e568/volumes" Jun 13 05:25:36 crc kubenswrapper[4894]: I0613 05:25:36.298435 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:37 crc kubenswrapper[4894]: I0613 05:25:37.287378 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:38 crc kubenswrapper[4894]: I0613 05:25:38.465481 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.228524 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-689g6" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="registry-server" containerID="cri-o://657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84" gracePeriod=2 Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.736261 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.838132 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities\") pod \"677fd655-48fb-45cb-8063-ed07a20429fd\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.838337 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content\") pod \"677fd655-48fb-45cb-8063-ed07a20429fd\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.838483 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5llb9\" (UniqueName: \"kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9\") pod \"677fd655-48fb-45cb-8063-ed07a20429fd\" (UID: \"677fd655-48fb-45cb-8063-ed07a20429fd\") " Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.839273 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities" (OuterVolumeSpecName: "utilities") pod "677fd655-48fb-45cb-8063-ed07a20429fd" (UID: "677fd655-48fb-45cb-8063-ed07a20429fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.848838 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9" (OuterVolumeSpecName: "kube-api-access-5llb9") pod "677fd655-48fb-45cb-8063-ed07a20429fd" (UID: "677fd655-48fb-45cb-8063-ed07a20429fd"). InnerVolumeSpecName "kube-api-access-5llb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.848896 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "677fd655-48fb-45cb-8063-ed07a20429fd" (UID: "677fd655-48fb-45cb-8063-ed07a20429fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.941740 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.941797 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5llb9\" (UniqueName: \"kubernetes.io/projected/677fd655-48fb-45cb-8063-ed07a20429fd-kube-api-access-5llb9\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:39 crc kubenswrapper[4894]: I0613 05:25:39.941818 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677fd655-48fb-45cb-8063-ed07a20429fd-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.245725 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-689g6" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.245724 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerDied","Data":"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84"} Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.245867 4894 scope.go:117] "RemoveContainer" containerID="657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.245716 4894 generic.go:334] "Generic (PLEG): container finished" podID="677fd655-48fb-45cb-8063-ed07a20429fd" containerID="657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84" exitCode=0 Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.247317 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-689g6" event={"ID":"677fd655-48fb-45cb-8063-ed07a20429fd","Type":"ContainerDied","Data":"1212ca05eba3f3df51ab552e2d239a9060b805abaef41187b27d6add36df8fda"} Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.290703 4894 scope.go:117] "RemoveContainer" containerID="41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.312336 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.320472 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-689g6"] Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.333051 4894 scope.go:117] "RemoveContainer" containerID="1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.377428 4894 scope.go:117] "RemoveContainer" containerID="657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84" Jun 13 05:25:40 crc kubenswrapper[4894]: E0613 05:25:40.379091 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84\": container with ID starting with 657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84 not found: ID does not exist" containerID="657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.379161 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84"} err="failed to get container status \"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84\": rpc error: code = NotFound desc = could not find container \"657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84\": container with ID starting with 657c1544e0049329c3207e8821867f70ace36c4817ee308d9f3eeca8d8a5bb84 not found: ID does not exist" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.379210 4894 scope.go:117] "RemoveContainer" containerID="41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c" Jun 13 05:25:40 crc kubenswrapper[4894]: E0613 05:25:40.379786 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c\": container with ID 
starting with 41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c not found: ID does not exist" containerID="41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.379895 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c"} err="failed to get container status \"41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c\": rpc error: code = NotFound desc = could not find container \"41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c\": container with ID starting with 41ff48dc2da18889020cd0281651e47e0f84b1bbde5e50b16cff6b612b99226c not found: ID does not exist" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.380038 4894 scope.go:117] "RemoveContainer" containerID="1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38" Jun 13 05:25:40 crc kubenswrapper[4894]: E0613 05:25:40.380847 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38\": container with ID starting with 1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38 not found: ID does not exist" containerID="1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38" Jun 13 05:25:40 crc kubenswrapper[4894]: I0613 05:25:40.380903 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38"} err="failed to get container status \"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38\": rpc error: code = NotFound desc = could not find container \"1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38\": container with ID starting with 1b4316d7f8dd6ecedfb080d60bf115118cb584bb4cc06ff84e824c3f92331e38 not found: ID does not exist" Jun 13 05:25:42 crc kubenswrapper[4894]: I0613 05:25:42.072166 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-458gw" Jun 13 05:25:42 crc kubenswrapper[4894]: I0613 05:25:42.293604 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" path="/var/lib/kubelet/pods/677fd655-48fb-45cb-8063-ed07a20429fd/volumes" Jun 13 05:25:42 crc kubenswrapper[4894]: I0613 05:25:42.699282 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-458gw"] Jun 13 05:25:42 crc kubenswrapper[4894]: I0613 05:25:42.866056 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 05:25:42 crc kubenswrapper[4894]: I0613 05:25:42.866422 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zf66l" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="registry-server" containerID="cri-o://43f4ed7693fb32c304ea9933edbe378d969271942642f73649450fd25df3408d" gracePeriod=2 Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.284267 4894 generic.go:334] "Generic (PLEG): container finished" podID="70994bbf-8b6d-476f-8125-191d4a08205e" containerID="43f4ed7693fb32c304ea9933edbe378d969271942642f73649450fd25df3408d" exitCode=0 Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.284415 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerDied","Data":"43f4ed7693fb32c304ea9933edbe378d969271942642f73649450fd25df3408d"} Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.284571 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf66l" event={"ID":"70994bbf-8b6d-476f-8125-191d4a08205e","Type":"ContainerDied","Data":"02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846"} Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.284587 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02eaa1e833cab18426bcb31e2fbd51e3e9fa5bdb0317908f611cb6c9eec24846" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.314305 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf66l" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.448153 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvccg\" (UniqueName: \"kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg\") pod \"70994bbf-8b6d-476f-8125-191d4a08205e\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.448251 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities\") pod \"70994bbf-8b6d-476f-8125-191d4a08205e\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.448319 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content\") pod \"70994bbf-8b6d-476f-8125-191d4a08205e\" (UID: \"70994bbf-8b6d-476f-8125-191d4a08205e\") " Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.449154 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities" (OuterVolumeSpecName: "utilities") pod "70994bbf-8b6d-476f-8125-191d4a08205e" (UID: "70994bbf-8b6d-476f-8125-191d4a08205e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.467835 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg" (OuterVolumeSpecName: "kube-api-access-mvccg") pod "70994bbf-8b6d-476f-8125-191d4a08205e" (UID: "70994bbf-8b6d-476f-8125-191d4a08205e"). InnerVolumeSpecName "kube-api-access-mvccg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.503215 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70994bbf-8b6d-476f-8125-191d4a08205e" (UID: "70994bbf-8b6d-476f-8125-191d4a08205e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.550067 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvccg\" (UniqueName: \"kubernetes.io/projected/70994bbf-8b6d-476f-8125-191d4a08205e-kube-api-access-mvccg\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.550093 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:43 crc kubenswrapper[4894]: I0613 05:25:43.550107 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70994bbf-8b6d-476f-8125-191d4a08205e-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:25:44 crc kubenswrapper[4894]: I0613 05:25:44.289385 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf66l" Jun 13 05:25:44 crc kubenswrapper[4894]: I0613 05:25:44.315753 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 05:25:44 crc kubenswrapper[4894]: I0613 05:25:44.320579 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zf66l"] Jun 13 05:25:46 crc kubenswrapper[4894]: I0613 05:25:46.295803 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" path="/var/lib/kubelet/pods/70994bbf-8b6d-476f-8125-191d4a08205e/volumes" Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.932852 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.943031 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.955309 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8nbmx"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.964134 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.973036 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4b5gp"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.982597 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.989256 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.994823 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4lsw"] Jun 13 05:25:52 crc kubenswrapper[4894]: I0613 05:25:52.999831 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8nbmx"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.004876 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 
05:25:53.009667 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rntj8"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.014328 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.019112 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-qlqzn"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.023794 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tzznt"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.028376 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7kzf7"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.033193 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.038574 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qbdsw"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.043441 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w5hp2"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.048704 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424"] Jun 13 05:25:53 crc kubenswrapper[4894]: I0613 05:25:53.053298 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rm424"] Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.292732 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2464fbe8-feb0-4aa6-8985-da3207358c52" path="/var/lib/kubelet/pods/2464fbe8-feb0-4aa6-8985-da3207358c52/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.294740 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf" path="/var/lib/kubelet/pods/3f7df21e-290c-47ce-8c5d-bcf8a35dcfdf/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.296089 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d8727f-e986-41ac-abb1-f33e1b395d03" path="/var/lib/kubelet/pods/44d8727f-e986-41ac-abb1-f33e1b395d03/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.297488 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0327b0-6896-425c-9e62-d179c465ff04" path="/var/lib/kubelet/pods/7b0327b0-6896-425c-9e62-d179c465ff04/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.299999 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85491710-8923-482a-bd20-6e82b284a439" path="/var/lib/kubelet/pods/85491710-8923-482a-bd20-6e82b284a439/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.301385 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9451abf4-721b-4ea5-b14e-5ac8f9beb4f5" path="/var/lib/kubelet/pods/9451abf4-721b-4ea5-b14e-5ac8f9beb4f5/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.302765 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05db1ce-0491-40b6-a148-be6b414542bc" 
path="/var/lib/kubelet/pods/c05db1ce-0491-40b6-a148-be6b414542bc/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.305288 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d382942f-e4cf-4fa0-9331-f8dbf464cd2e" path="/var/lib/kubelet/pods/d382942f-e4cf-4fa0-9331-f8dbf464cd2e/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.306751 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26dfa3d-1615-40aa-9ede-01780eeaf5d9" path="/var/lib/kubelet/pods/e26dfa3d-1615-40aa-9ede-01780eeaf5d9/volumes" Jun 13 05:25:54 crc kubenswrapper[4894]: I0613 05:25:54.308285 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5f7535-a39c-4891-85fe-f1def5f6f9a8" path="/var/lib/kubelet/pods/fd5f7535-a39c-4891-85fe-f1def5f6f9a8/volumes" Jun 13 05:25:56 crc kubenswrapper[4894]: I0613 05:25:56.237106 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:25:56 crc kubenswrapper[4894]: I0613 05:25:56.237770 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.453112 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-5nx7z"] Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454333 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454359 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454390 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454403 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454424 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454437 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454456 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454468 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454497 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="registry-server" Jun 13 05:26:01 crc 
kubenswrapper[4894]: I0613 05:26:01.454510 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454525 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454537 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454553 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454565 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454585 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454597 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="extract-utilities" Jun 13 05:26:01 crc kubenswrapper[4894]: E0613 05:26:01.454626 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.454639 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="extract-content" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.455056 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="70994bbf-8b6d-476f-8125-191d4a08205e" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.455085 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9677ec88-902d-4746-b470-36d04f74e568" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.455111 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="677fd655-48fb-45cb-8063-ed07a20429fd" containerName="registry-server" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.456186 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.461009 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.607710 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.607794 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdvg2\" (UniqueName: \"kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.710090 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.710213 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvg2\" (UniqueName: \"kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.710955 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.742699 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdvg2\" (UniqueName: \"kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2\") pod \"crc-debug-5nx7z\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " pod="openstack/crc-debug-5nx7z" Jun 13 05:26:01 crc kubenswrapper[4894]: I0613 05:26:01.791888 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5nx7z" Jun 13 05:26:02 crc kubenswrapper[4894]: I0613 05:26:02.474869 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5nx7z" event={"ID":"0eef0587-0ff0-458d-a193-9e68a1e2a3a8","Type":"ContainerStarted","Data":"ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c"} Jun 13 05:26:02 crc kubenswrapper[4894]: I0613 05:26:02.475159 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5nx7z" event={"ID":"0eef0587-0ff0-458d-a193-9e68a1e2a3a8","Type":"ContainerStarted","Data":"49d97670d999231541d8f54206a087179fc67756c51878336c3347b2454141e8"} Jun 13 05:26:02 crc kubenswrapper[4894]: I0613 05:26:02.495984 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-5nx7z" podStartSLOduration=1.495972584 podStartE2EDuration="1.495972584s" podCreationTimestamp="2025-06-13 05:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:26:02.492873916 +0000 UTC m=+2120.939121419" watchObservedRunningTime="2025-06-13 05:26:02.495972584 +0000 UTC m=+2120.942220047" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.286849 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc"] Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.288282 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.291377 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.293187 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.293383 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.296469 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.298011 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc"] Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.298201 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.396085 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.396390 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.396486 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.396621 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.396761 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwmq\" (UniqueName: \"kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.498310 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwmq\" (UniqueName: \"kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.498600 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.498682 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.498718 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.498761 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: 
\"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.506278 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.506808 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.506917 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.508302 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.533277 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwmq\" (UniqueName: \"kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:06 crc kubenswrapper[4894]: I0613 05:26:06.607833 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:07 crc kubenswrapper[4894]: I0613 05:26:07.154750 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc"] Jun 13 05:26:07 crc kubenswrapper[4894]: I0613 05:26:07.531546 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" event={"ID":"c30d60b6-5327-4d21-b668-a5aa64265c8c","Type":"ContainerStarted","Data":"27df8ec2029ae44927e75e83598ef32547f3e3a9d3c9f6d49ef959dd468e40a0"} Jun 13 05:26:08 crc kubenswrapper[4894]: I0613 05:26:08.543279 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" event={"ID":"c30d60b6-5327-4d21-b668-a5aa64265c8c","Type":"ContainerStarted","Data":"c66836445f91492dff7d375bd5dec5bd049d0b6c0dd7ddcb6c47aacad01ede67"} Jun 13 05:26:08 crc kubenswrapper[4894]: I0613 05:26:08.561601 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" podStartSLOduration=2.033824303 podStartE2EDuration="2.561586881s" podCreationTimestamp="2025-06-13 05:26:06 +0000 UTC" firstStartedPulling="2025-06-13 05:26:07.180264977 +0000 UTC m=+2125.626512440" lastFinishedPulling="2025-06-13 05:26:07.708027555 +0000 UTC m=+2126.154275018" observedRunningTime="2025-06-13 05:26:08.559740559 +0000 UTC m=+2127.005988032" watchObservedRunningTime="2025-06-13 05:26:08.561586881 +0000 UTC m=+2127.007834344" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.419899 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-5nx7z"] Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.420473 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-5nx7z" podUID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" containerName="container-00" containerID="cri-o://ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c" gracePeriod=2 Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.439536 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-5nx7z"] Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.511681 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-5nx7z" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.575483 4894 generic.go:334] "Generic (PLEG): container finished" podID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" containerID="ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c" exitCode=0 Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.575530 4894 scope.go:117] "RemoveContainer" containerID="ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.575630 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5nx7z" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.594964 4894 scope.go:117] "RemoveContainer" containerID="ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c" Jun 13 05:26:12 crc kubenswrapper[4894]: E0613 05:26:12.595317 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c\": container with ID starting with ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c not found: ID does not exist" containerID="ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.595375 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c"} err="failed to get container status \"ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c\": rpc error: code = NotFound desc = could not find container \"ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c\": container with ID starting with ef54715b5c3807713616880c4fac57db790f0397d2446681b18449d18d0c179c not found: ID does not exist" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.639523 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host\") pod \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.639693 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdvg2\" (UniqueName: \"kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2\") pod \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\" (UID: \"0eef0587-0ff0-458d-a193-9e68a1e2a3a8\") " Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.640699 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host" (OuterVolumeSpecName: "host") pod "0eef0587-0ff0-458d-a193-9e68a1e2a3a8" (UID: "0eef0587-0ff0-458d-a193-9e68a1e2a3a8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.645019 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2" (OuterVolumeSpecName: "kube-api-access-wdvg2") pod "0eef0587-0ff0-458d-a193-9e68a1e2a3a8" (UID: "0eef0587-0ff0-458d-a193-9e68a1e2a3a8"). InnerVolumeSpecName "kube-api-access-wdvg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.744451 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.744519 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdvg2\" (UniqueName: \"kubernetes.io/projected/0eef0587-0ff0-458d-a193-9e68a1e2a3a8-kube-api-access-wdvg2\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.948246 4894 scope.go:117] "RemoveContainer" containerID="69ee6700749dd71e2e81c635d15c667c783e272d4e390ab145e605c458ab0024" Jun 13 05:26:12 crc kubenswrapper[4894]: I0613 05:26:12.982263 4894 scope.go:117] "RemoveContainer" containerID="84e911935c044276d3428d53206576eea5aad9005bcd4369306125b4316ec46c" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.105839 4894 scope.go:117] "RemoveContainer" containerID="b0a5a0a6443e76bf98e6f1686942f702b8bf2ef709847d4c328d6acb8ed9b96d" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.151622 4894 scope.go:117] "RemoveContainer" containerID="7fec164af3701d73e78da5c63e2a0bd8b9db87d7d559a397fb0fa9fac82c7ca9" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.260024 4894 scope.go:117] "RemoveContainer" containerID="43f4ed7693fb32c304ea9933edbe378d969271942642f73649450fd25df3408d" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.281532 4894 scope.go:117] "RemoveContainer" containerID="1b0f2629782e9f14cb6e94ea221258f07df6750fe6590a262d8105761ba4a41f" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.341332 4894 scope.go:117] "RemoveContainer" containerID="06f374120fec460d44b20f490960f5eabf2185611be8e43b58038016b3113335" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.377691 4894 scope.go:117] "RemoveContainer" containerID="08cfc161a38c832f689277ac45e13aa38ae4bdfab62ff3eb6b5a010bfac2c2bd" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.403423 4894 scope.go:117] "RemoveContainer" containerID="2fc71543fea0c87f184bf2b24d3f67b589d1a15b69ccafaa720416738d5821c1" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.442414 4894 scope.go:117] "RemoveContainer" containerID="65ef03f03c1f7d50827d9379d9a0df505c65cd4c054cf948d7c3628fb443f688" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.499710 4894 scope.go:117] "RemoveContainer" containerID="aa9ac4e1616dc7478149e608f0962016af48800050803d0a9558949bd49d148b" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.566025 4894 scope.go:117] "RemoveContainer" containerID="38ad01742c7fbefdc6d3538cc0ca51c9c25dbc688fee254ed6295ac170e0a4f9" Jun 13 05:26:13 crc kubenswrapper[4894]: I0613 05:26:13.588830 4894 scope.go:117] "RemoveContainer" containerID="35086ac5d287e8d043392ae98b89e6ea03178239a86d2099fa3ad49e4a677c88" Jun 13 05:26:14 crc kubenswrapper[4894]: I0613 05:26:14.284490 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" path="/var/lib/kubelet/pods/0eef0587-0ff0-458d-a193-9e68a1e2a3a8/volumes" Jun 13 05:26:20 crc kubenswrapper[4894]: I0613 05:26:20.688754 4894 generic.go:334] "Generic (PLEG): container finished" podID="c30d60b6-5327-4d21-b668-a5aa64265c8c" containerID="c66836445f91492dff7d375bd5dec5bd049d0b6c0dd7ddcb6c47aacad01ede67" exitCode=0 Jun 13 05:26:20 crc kubenswrapper[4894]: I0613 05:26:20.688865 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" event={"ID":"c30d60b6-5327-4d21-b668-a5aa64265c8c","Type":"ContainerDied","Data":"c66836445f91492dff7d375bd5dec5bd049d0b6c0dd7ddcb6c47aacad01ede67"} Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.184786 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.256549 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtwmq\" (UniqueName: \"kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq\") pod \"c30d60b6-5327-4d21-b668-a5aa64265c8c\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.256599 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle\") pod \"c30d60b6-5327-4d21-b668-a5aa64265c8c\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.256687 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key\") pod \"c30d60b6-5327-4d21-b668-a5aa64265c8c\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.256738 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph\") pod \"c30d60b6-5327-4d21-b668-a5aa64265c8c\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.256834 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory\") pod \"c30d60b6-5327-4d21-b668-a5aa64265c8c\" (UID: \"c30d60b6-5327-4d21-b668-a5aa64265c8c\") " Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.266443 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq" (OuterVolumeSpecName: "kube-api-access-rtwmq") pod "c30d60b6-5327-4d21-b668-a5aa64265c8c" (UID: "c30d60b6-5327-4d21-b668-a5aa64265c8c"). InnerVolumeSpecName "kube-api-access-rtwmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.268829 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph" (OuterVolumeSpecName: "ceph") pod "c30d60b6-5327-4d21-b668-a5aa64265c8c" (UID: "c30d60b6-5327-4d21-b668-a5aa64265c8c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.269495 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c30d60b6-5327-4d21-b668-a5aa64265c8c" (UID: "c30d60b6-5327-4d21-b668-a5aa64265c8c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.308095 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c30d60b6-5327-4d21-b668-a5aa64265c8c" (UID: "c30d60b6-5327-4d21-b668-a5aa64265c8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.309028 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory" (OuterVolumeSpecName: "inventory") pod "c30d60b6-5327-4d21-b668-a5aa64265c8c" (UID: "c30d60b6-5327-4d21-b668-a5aa64265c8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.360580 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.360615 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.360629 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtwmq\" (UniqueName: \"kubernetes.io/projected/c30d60b6-5327-4d21-b668-a5aa64265c8c-kube-api-access-rtwmq\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.360644 4894 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.360702 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c30d60b6-5327-4d21-b668-a5aa64265c8c-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.714889 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" event={"ID":"c30d60b6-5327-4d21-b668-a5aa64265c8c","Type":"ContainerDied","Data":"27df8ec2029ae44927e75e83598ef32547f3e3a9d3c9f6d49ef959dd468e40a0"} Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.714945 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27df8ec2029ae44927e75e83598ef32547f3e3a9d3c9f6d49ef959dd468e40a0" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.715368 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.877533 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq"] Jun 13 05:26:22 crc kubenswrapper[4894]: E0613 05:26:22.878311 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30d60b6-5327-4d21-b668-a5aa64265c8c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.878330 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30d60b6-5327-4d21-b668-a5aa64265c8c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:26:22 crc kubenswrapper[4894]: E0613 05:26:22.878344 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" containerName="container-00" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.878353 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" containerName="container-00" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.878604 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eef0587-0ff0-458d-a193-9e68a1e2a3a8" containerName="container-00" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.878635 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30d60b6-5327-4d21-b668-a5aa64265c8c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.879373 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.884003 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.884504 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.885922 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.886091 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.886225 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.892831 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq"] Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.972851 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.972932 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmjw\" (UniqueName: 
\"kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.972980 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.973053 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:22 crc kubenswrapper[4894]: I0613 05:26:22.973070 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.074103 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.074154 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.074264 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.074313 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmjw\" (UniqueName: \"kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.074360 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.078532 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.078603 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.086455 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.087140 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.093687 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmjw\" (UniqueName: \"kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.195747 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:26:23 crc kubenswrapper[4894]: I0613 05:26:23.728670 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq"] Jun 13 05:26:23 crc kubenswrapper[4894]: W0613 05:26:23.736838 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1350090_3cce_492c_a445_b80a7ac7afbc.slice/crio-8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95 WatchSource:0}: Error finding container 8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95: Status 404 returned error can't find the container with id 8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95 Jun 13 05:26:24 crc kubenswrapper[4894]: I0613 05:26:24.732240 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" event={"ID":"c1350090-3cce-492c-a445-b80a7ac7afbc","Type":"ContainerStarted","Data":"0e1744855b614825e68e7ed8ef533cab1b206fd04011d2beb1680a55992d931e"} Jun 13 05:26:24 crc kubenswrapper[4894]: I0613 05:26:24.732527 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" event={"ID":"c1350090-3cce-492c-a445-b80a7ac7afbc","Type":"ContainerStarted","Data":"8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95"} Jun 13 05:26:24 crc kubenswrapper[4894]: I0613 05:26:24.755524 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" podStartSLOduration=2.198848755 podStartE2EDuration="2.755505622s" podCreationTimestamp="2025-06-13 05:26:22 +0000 UTC" firstStartedPulling="2025-06-13 05:26:23.739230167 +0000 UTC m=+2142.185477660" lastFinishedPulling="2025-06-13 05:26:24.295887024 +0000 UTC m=+2142.742134527" observedRunningTime="2025-06-13 05:26:24.750969713 +0000 UTC m=+2143.197217166" watchObservedRunningTime="2025-06-13 05:26:24.755505622 +0000 UTC m=+2143.201753085" Jun 13 05:26:26 crc kubenswrapper[4894]: I0613 05:26:26.236356 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:26:26 crc kubenswrapper[4894]: I0613 05:26:26.237887 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:26:56 crc kubenswrapper[4894]: I0613 05:26:56.236750 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:26:56 crc kubenswrapper[4894]: I0613 05:26:56.237449 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:26:56 crc kubenswrapper[4894]: I0613 05:26:56.237540 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:26:56 crc kubenswrapper[4894]: I0613 05:26:56.238363 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:26:56 crc kubenswrapper[4894]: I0613 05:26:56.238470 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" gracePeriod=600 Jun 13 05:26:56 crc kubenswrapper[4894]: E0613 05:26:56.370940 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:26:57 crc kubenswrapper[4894]: I0613 05:26:57.092011 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" exitCode=0 Jun 13 05:26:57 crc kubenswrapper[4894]: I0613 05:26:57.092066 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a"} Jun 13 05:26:57 crc kubenswrapper[4894]: I0613 05:26:57.092117 4894 scope.go:117] "RemoveContainer" containerID="bf4c8000196ead524be7b7942d536ce79180e6eb4ea11eb1b4d62e24aa656329" Jun 13 05:26:57 crc kubenswrapper[4894]: I0613 05:26:57.092932 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:26:57 crc kubenswrapper[4894]: E0613 05:26:57.093548 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:27:01 crc kubenswrapper[4894]: I0613 05:27:01.835490 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-fzjn7"] Jun 13 05:27:01 crc kubenswrapper[4894]: I0613 05:27:01.837941 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-fzjn7" Jun 13 05:27:01 crc kubenswrapper[4894]: I0613 05:27:01.844510 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:27:01 crc kubenswrapper[4894]: I0613 05:27:01.984280 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwh27\" (UniqueName: \"kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:01 crc kubenswrapper[4894]: I0613 05:27:01.984423 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:02 crc kubenswrapper[4894]: I0613 05:27:02.086361 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwh27\" (UniqueName: \"kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:02 crc kubenswrapper[4894]: I0613 05:27:02.086496 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:02 crc kubenswrapper[4894]: I0613 05:27:02.086718 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:02 crc kubenswrapper[4894]: I0613 05:27:02.113570 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwh27\" (UniqueName: \"kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27\") pod \"crc-debug-fzjn7\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " pod="openstack/crc-debug-fzjn7" Jun 13 05:27:02 crc kubenswrapper[4894]: I0613 05:27:02.165735 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-fzjn7" Jun 13 05:27:03 crc kubenswrapper[4894]: I0613 05:27:03.162028 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-fzjn7" event={"ID":"9078ef6c-a6aa-4514-9d11-2a9ec28c1593","Type":"ContainerStarted","Data":"589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4"} Jun 13 05:27:03 crc kubenswrapper[4894]: I0613 05:27:03.162728 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-fzjn7" event={"ID":"9078ef6c-a6aa-4514-9d11-2a9ec28c1593","Type":"ContainerStarted","Data":"7475be00ee71ee1cb3ce6c27f4ce3a7900e21f41f925c9d5190860a277dab211"} Jun 13 05:27:03 crc kubenswrapper[4894]: I0613 05:27:03.196613 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-fzjn7" podStartSLOduration=2.196586035 podStartE2EDuration="2.196586035s" podCreationTimestamp="2025-06-13 05:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:27:03.187367274 +0000 UTC m=+2181.633614767" watchObservedRunningTime="2025-06-13 05:27:03.196586035 +0000 UTC m=+2181.642833528" Jun 13 05:27:09 crc kubenswrapper[4894]: I0613 05:27:09.277999 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:27:09 crc kubenswrapper[4894]: E0613 05:27:09.279201 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:27:12 crc kubenswrapper[4894]: E0613 05:27:12.671892 4894 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:54048->38.102.83.213:40951: write tcp 38.102.83.213:54048->38.102.83.213:40951: write: broken pipe Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.703160 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-fzjn7"] Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.703493 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-fzjn7" podUID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" containerName="container-00" containerID="cri-o://589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4" gracePeriod=2 Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.715286 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-fzjn7"] Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.811572 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-fzjn7" Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.832572 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwh27\" (UniqueName: \"kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27\") pod \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.839532 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27" (OuterVolumeSpecName: "kube-api-access-cwh27") pod "9078ef6c-a6aa-4514-9d11-2a9ec28c1593" (UID: "9078ef6c-a6aa-4514-9d11-2a9ec28c1593"). InnerVolumeSpecName "kube-api-access-cwh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.934987 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host\") pod \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\" (UID: \"9078ef6c-a6aa-4514-9d11-2a9ec28c1593\") " Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.935457 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host" (OuterVolumeSpecName: "host") pod "9078ef6c-a6aa-4514-9d11-2a9ec28c1593" (UID: "9078ef6c-a6aa-4514-9d11-2a9ec28c1593"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:27:12 crc kubenswrapper[4894]: I0613 05:27:12.935627 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwh27\" (UniqueName: \"kubernetes.io/projected/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-kube-api-access-cwh27\") on node \"crc\" DevicePath \"\"" Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.038459 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9078ef6c-a6aa-4514-9d11-2a9ec28c1593-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.326648 4894 generic.go:334] "Generic (PLEG): container finished" podID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" containerID="589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4" exitCode=0 Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.326809 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-fzjn7" Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.326861 4894 scope.go:117] "RemoveContainer" containerID="589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4" Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.380666 4894 scope.go:117] "RemoveContainer" containerID="589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4" Jun 13 05:27:13 crc kubenswrapper[4894]: E0613 05:27:13.381087 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4\": container with ID starting with 589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4 not found: ID does not exist" containerID="589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4" Jun 13 05:27:13 crc kubenswrapper[4894]: I0613 05:27:13.381122 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4"} err="failed to get container status \"589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4\": rpc error: code = NotFound desc = could not find container \"589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4\": container with ID starting with 589030192f3bb8efd8c90a9f5b750e4785dff75a16e6a1c4c1e083035279e3f4 not found: ID does not exist" Jun 13 05:27:14 crc kubenswrapper[4894]: I0613 05:27:14.293255 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" path="/var/lib/kubelet/pods/9078ef6c-a6aa-4514-9d11-2a9ec28c1593/volumes" Jun 13 05:27:20 crc kubenswrapper[4894]: I0613 05:27:20.277735 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:27:20 crc kubenswrapper[4894]: E0613 05:27:20.278880 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:27:34 crc kubenswrapper[4894]: I0613 05:27:34.277456 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:27:34 crc kubenswrapper[4894]: E0613 05:27:34.278224 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:27:49 crc kubenswrapper[4894]: I0613 05:27:49.276836 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:27:49 crc kubenswrapper[4894]: E0613 05:27:49.277645 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:00 crc kubenswrapper[4894]: I0613 05:28:00.292734 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:28:00 crc kubenswrapper[4894]: E0613 05:28:00.293523 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.192793 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-cnx8n"] Jun 13 05:28:02 crc kubenswrapper[4894]: E0613 05:28:02.194754 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" containerName="container-00" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.194928 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" containerName="container-00" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.195378 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="9078ef6c-a6aa-4514-9d11-2a9ec28c1593" containerName="container-00" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.196469 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.202986 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.270932 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.271306 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vk5m\" (UniqueName: \"kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.373261 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.373324 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vk5m\" (UniqueName: \"kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.373472 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.393385 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vk5m\" (UniqueName: \"kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m\") pod \"crc-debug-cnx8n\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.529057 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-cnx8n" Jun 13 05:28:02 crc kubenswrapper[4894]: W0613 05:28:02.577489 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1ca0b8_3ad5_4eb4_afb8_4a633cc30929.slice/crio-a87ebe63b231746bd6b8c16fdfd66af313bddb9e8b0c967d73df305aaedb7310 WatchSource:0}: Error finding container a87ebe63b231746bd6b8c16fdfd66af313bddb9e8b0c967d73df305aaedb7310: Status 404 returned error can't find the container with id a87ebe63b231746bd6b8c16fdfd66af313bddb9e8b0c967d73df305aaedb7310 Jun 13 05:28:02 crc kubenswrapper[4894]: I0613 05:28:02.797542 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-cnx8n" event={"ID":"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929","Type":"ContainerStarted","Data":"a87ebe63b231746bd6b8c16fdfd66af313bddb9e8b0c967d73df305aaedb7310"} Jun 13 05:28:03 crc kubenswrapper[4894]: I0613 05:28:03.806331 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-cnx8n" event={"ID":"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929","Type":"ContainerStarted","Data":"b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa"} Jun 13 05:28:03 crc kubenswrapper[4894]: I0613 05:28:03.826620 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-cnx8n" podStartSLOduration=1.826601609 podStartE2EDuration="1.826601609s" podCreationTimestamp="2025-06-13 05:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:28:03.819418046 +0000 UTC m=+2242.265665549" watchObservedRunningTime="2025-06-13 05:28:03.826601609 +0000 UTC m=+2242.272849072" Jun 13 05:28:10 crc kubenswrapper[4894]: I0613 05:28:10.879106 4894 generic.go:334] "Generic (PLEG): container finished" podID="c1350090-3cce-492c-a445-b80a7ac7afbc" containerID="0e1744855b614825e68e7ed8ef533cab1b206fd04011d2beb1680a55992d931e" exitCode=0 Jun 13 05:28:10 crc kubenswrapper[4894]: I0613 05:28:10.879396 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" event={"ID":"c1350090-3cce-492c-a445-b80a7ac7afbc","Type":"ContainerDied","Data":"0e1744855b614825e68e7ed8ef533cab1b206fd04011d2beb1680a55992d931e"} Jun 13 05:28:11 crc kubenswrapper[4894]: I0613 05:28:11.277812 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:28:11 crc kubenswrapper[4894]: E0613 05:28:11.278550 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.401919 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.475371 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph\") pod \"c1350090-3cce-492c-a445-b80a7ac7afbc\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.475486 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle\") pod \"c1350090-3cce-492c-a445-b80a7ac7afbc\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.475556 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmjw\" (UniqueName: \"kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw\") pod \"c1350090-3cce-492c-a445-b80a7ac7afbc\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.475617 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key\") pod \"c1350090-3cce-492c-a445-b80a7ac7afbc\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.475706 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory\") pod \"c1350090-3cce-492c-a445-b80a7ac7afbc\" (UID: \"c1350090-3cce-492c-a445-b80a7ac7afbc\") " Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.480610 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c1350090-3cce-492c-a445-b80a7ac7afbc" (UID: "c1350090-3cce-492c-a445-b80a7ac7afbc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.481073 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph" (OuterVolumeSpecName: "ceph") pod "c1350090-3cce-492c-a445-b80a7ac7afbc" (UID: "c1350090-3cce-492c-a445-b80a7ac7afbc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.485306 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw" (OuterVolumeSpecName: "kube-api-access-fvmjw") pod "c1350090-3cce-492c-a445-b80a7ac7afbc" (UID: "c1350090-3cce-492c-a445-b80a7ac7afbc"). InnerVolumeSpecName "kube-api-access-fvmjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.500897 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory" (OuterVolumeSpecName: "inventory") pod "c1350090-3cce-492c-a445-b80a7ac7afbc" (UID: "c1350090-3cce-492c-a445-b80a7ac7afbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.502306 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1350090-3cce-492c-a445-b80a7ac7afbc" (UID: "c1350090-3cce-492c-a445-b80a7ac7afbc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.578533 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.578571 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.578586 4894 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.578602 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmjw\" (UniqueName: \"kubernetes.io/projected/c1350090-3cce-492c-a445-b80a7ac7afbc-kube-api-access-fvmjw\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.578623 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1350090-3cce-492c-a445-b80a7ac7afbc-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.906684 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" event={"ID":"c1350090-3cce-492c-a445-b80a7ac7afbc","Type":"ContainerDied","Data":"8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95"} Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.907037 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8265025447a9d4ac2f34555f98c9c30c6011fc05209174c20b8c7d67113d0e95" Jun 13 05:28:12 crc kubenswrapper[4894]: I0613 05:28:12.906812 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.063759 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc"] Jun 13 05:28:13 crc kubenswrapper[4894]: E0613 05:28:13.064363 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1350090-3cce-492c-a445-b80a7ac7afbc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.064398 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1350090-3cce-492c-a445-b80a7ac7afbc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.064716 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1350090-3cce-492c-a445-b80a7ac7afbc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.065753 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.068972 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.069244 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.071093 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.071579 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.071871 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.092540 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc"] Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.192769 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-cnx8n"] Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.193215 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-cnx8n" podUID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" containerName="container-00" containerID="cri-o://b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa" gracePeriod=2 Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.196225 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.196332 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: 
\"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.196402 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88d2\" (UniqueName: \"kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.196433 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.207808 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-cnx8n"] Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.257751 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-cnx8n" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.297622 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.297727 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88d2\" (UniqueName: \"kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.297763 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.297792 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.302054 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.302096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.307857 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.317705 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88d2\" (UniqueName: \"kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.395723 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.399794 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vk5m\" (UniqueName: \"kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m\") pod \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.400281 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host\") pod \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\" (UID: \"1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929\") " Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.401306 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host" (OuterVolumeSpecName: "host") pod "1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" (UID: "1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.409935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m" (OuterVolumeSpecName: "kube-api-access-4vk5m") pod "1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" (UID: "1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929"). InnerVolumeSpecName "kube-api-access-4vk5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.503113 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.503161 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vk5m\" (UniqueName: \"kubernetes.io/projected/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929-kube-api-access-4vk5m\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.922164 4894 generic.go:334] "Generic (PLEG): container finished" podID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" containerID="b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa" exitCode=0 Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.922260 4894 scope.go:117] "RemoveContainer" containerID="b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.922311 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-cnx8n" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.968616 4894 scope.go:117] "RemoveContainer" containerID="b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa" Jun 13 05:28:13 crc kubenswrapper[4894]: E0613 05:28:13.969265 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa\": container with ID starting with b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa not found: ID does not exist" containerID="b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa" Jun 13 05:28:13 crc kubenswrapper[4894]: I0613 05:28:13.969299 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa"} err="failed to get container status \"b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa\": rpc error: code = NotFound desc = could not find container \"b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa\": container with ID starting with b2578b4763fb7e22fc66925c911bedf3585f8684721cde9fecb76b34327c22fa not found: ID does not exist" Jun 13 05:28:14 crc kubenswrapper[4894]: I0613 05:28:14.008970 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc"] Jun 13 05:28:14 crc kubenswrapper[4894]: I0613 05:28:14.294678 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" path="/var/lib/kubelet/pods/1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929/volumes" Jun 13 05:28:14 crc kubenswrapper[4894]: I0613 05:28:14.934960 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" event={"ID":"d54bac12-88af-439e-8aab-def55120ac8f","Type":"ContainerStarted","Data":"257a0c2be01d01ecc68c66da48c8200359d04b1cd31f549bb006cb5f6d4cb5e8"} Jun 13 05:28:14 crc kubenswrapper[4894]: I0613 05:28:14.935286 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" 
event={"ID":"d54bac12-88af-439e-8aab-def55120ac8f","Type":"ContainerStarted","Data":"d42401563785845745a026f6965d1d34a05ea1776dd9dd32715a6dc6d194e167"} Jun 13 05:28:22 crc kubenswrapper[4894]: I0613 05:28:22.287047 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:28:22 crc kubenswrapper[4894]: E0613 05:28:22.287778 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:34 crc kubenswrapper[4894]: I0613 05:28:34.277038 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:28:34 crc kubenswrapper[4894]: E0613 05:28:34.277990 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:49 crc kubenswrapper[4894]: I0613 05:28:49.277172 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:28:49 crc kubenswrapper[4894]: E0613 05:28:49.278195 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:28:51 crc kubenswrapper[4894]: I0613 05:28:51.320343 4894 generic.go:334] "Generic (PLEG): container finished" podID="d54bac12-88af-439e-8aab-def55120ac8f" containerID="257a0c2be01d01ecc68c66da48c8200359d04b1cd31f549bb006cb5f6d4cb5e8" exitCode=0 Jun 13 05:28:51 crc kubenswrapper[4894]: I0613 05:28:51.320464 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" event={"ID":"d54bac12-88af-439e-8aab-def55120ac8f","Type":"ContainerDied","Data":"257a0c2be01d01ecc68c66da48c8200359d04b1cd31f549bb006cb5f6d4cb5e8"} Jun 13 05:28:52 crc kubenswrapper[4894]: I0613 05:28:52.800177 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:52 crc kubenswrapper[4894]: I0613 05:28:52.987504 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key\") pod \"d54bac12-88af-439e-8aab-def55120ac8f\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " Jun 13 05:28:52 crc kubenswrapper[4894]: I0613 05:28:52.988049 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory\") pod \"d54bac12-88af-439e-8aab-def55120ac8f\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " Jun 13 05:28:52 crc kubenswrapper[4894]: I0613 05:28:52.988090 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88d2\" (UniqueName: \"kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2\") pod \"d54bac12-88af-439e-8aab-def55120ac8f\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " Jun 13 05:28:52 crc kubenswrapper[4894]: I0613 05:28:52.988161 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph\") pod \"d54bac12-88af-439e-8aab-def55120ac8f\" (UID: \"d54bac12-88af-439e-8aab-def55120ac8f\") " Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.000822 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph" (OuterVolumeSpecName: "ceph") pod "d54bac12-88af-439e-8aab-def55120ac8f" (UID: "d54bac12-88af-439e-8aab-def55120ac8f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.004898 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2" (OuterVolumeSpecName: "kube-api-access-n88d2") pod "d54bac12-88af-439e-8aab-def55120ac8f" (UID: "d54bac12-88af-439e-8aab-def55120ac8f"). InnerVolumeSpecName "kube-api-access-n88d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.038746 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory" (OuterVolumeSpecName: "inventory") pod "d54bac12-88af-439e-8aab-def55120ac8f" (UID: "d54bac12-88af-439e-8aab-def55120ac8f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.044439 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d54bac12-88af-439e-8aab-def55120ac8f" (UID: "d54bac12-88af-439e-8aab-def55120ac8f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.090339 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.090379 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88d2\" (UniqueName: \"kubernetes.io/projected/d54bac12-88af-439e-8aab-def55120ac8f-kube-api-access-n88d2\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.090394 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.090405 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d54bac12-88af-439e-8aab-def55120ac8f-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.344177 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" event={"ID":"d54bac12-88af-439e-8aab-def55120ac8f","Type":"ContainerDied","Data":"d42401563785845745a026f6965d1d34a05ea1776dd9dd32715a6dc6d194e167"} Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.344231 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42401563785845745a026f6965d1d34a05ea1776dd9dd32715a6dc6d194e167" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.344595 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.479292 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm"] Jun 13 05:28:53 crc kubenswrapper[4894]: E0613 05:28:53.479668 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54bac12-88af-439e-8aab-def55120ac8f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.479688 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54bac12-88af-439e-8aab-def55120ac8f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:53 crc kubenswrapper[4894]: E0613 05:28:53.479714 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" containerName="container-00" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.479724 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" containerName="container-00" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.479928 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1ca0b8-3ad5-4eb4-afb8-4a633cc30929" containerName="container-00" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.479956 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54bac12-88af-439e-8aab-def55120ac8f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.480596 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.485246 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.485763 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.486083 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.493940 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.494025 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm"] Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.494228 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.599284 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.599354 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.599422 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bs7\" (UniqueName: \"kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.599578 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.702333 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.702606 4894 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.702674 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.702709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bs7\" (UniqueName: \"kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.707137 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.708739 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.712859 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.723041 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bs7\" (UniqueName: \"kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:53 crc kubenswrapper[4894]: I0613 05:28:53.815187 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:28:54 crc kubenswrapper[4894]: I0613 05:28:54.125741 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm"] Jun 13 05:28:54 crc kubenswrapper[4894]: I0613 05:28:54.138281 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:28:54 crc kubenswrapper[4894]: I0613 05:28:54.355500 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" event={"ID":"f45b6e84-264a-43a0-8103-86b94fbbc5a5","Type":"ContainerStarted","Data":"852989c3afc6bbb804f683df7cf0f3867d8a880b2f396027f449663a0bfa5dbf"} Jun 13 05:28:55 crc kubenswrapper[4894]: I0613 05:28:55.368556 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" event={"ID":"f45b6e84-264a-43a0-8103-86b94fbbc5a5","Type":"ContainerStarted","Data":"a1859ee60d897effdac256fa43cc3e13d39a41fab51db9e60c63fd58f1f3a464"} Jun 13 05:28:55 crc kubenswrapper[4894]: I0613 05:28:55.388743 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" podStartSLOduration=1.958940307 podStartE2EDuration="2.38871645s" podCreationTimestamp="2025-06-13 05:28:53 +0000 UTC" firstStartedPulling="2025-06-13 05:28:54.138005526 +0000 UTC m=+2292.584252999" lastFinishedPulling="2025-06-13 05:28:54.567781649 +0000 UTC m=+2293.014029142" observedRunningTime="2025-06-13 05:28:55.387130655 +0000 UTC m=+2293.833378178" watchObservedRunningTime="2025-06-13 05:28:55.38871645 +0000 UTC m=+2293.834963953" Jun 13 05:29:00 crc kubenswrapper[4894]: I0613 05:29:00.434734 4894 generic.go:334] "Generic (PLEG): container finished" podID="f45b6e84-264a-43a0-8103-86b94fbbc5a5" containerID="a1859ee60d897effdac256fa43cc3e13d39a41fab51db9e60c63fd58f1f3a464" exitCode=0 Jun 13 05:29:00 crc kubenswrapper[4894]: I0613 05:29:00.434809 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" event={"ID":"f45b6e84-264a-43a0-8103-86b94fbbc5a5","Type":"ContainerDied","Data":"a1859ee60d897effdac256fa43cc3e13d39a41fab51db9e60c63fd58f1f3a464"} Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.639630 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-vfznm"] Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.640977 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.643367 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.770044 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.770124 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq65d\" (UniqueName: \"kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.871919 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.871996 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq65d\" (UniqueName: \"kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.872083 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.895796 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq65d\" (UniqueName: \"kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d\") pod \"crc-debug-vfznm\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " pod="openstack/crc-debug-vfznm" Jun 13 05:29:01 crc kubenswrapper[4894]: I0613 05:29:01.963494 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-vfznm" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.037008 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.179751 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph\") pod \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.179840 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key\") pod \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.180042 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory\") pod \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.180088 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bs7\" (UniqueName: \"kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7\") pod \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\" (UID: \"f45b6e84-264a-43a0-8103-86b94fbbc5a5\") " Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.195011 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7" (OuterVolumeSpecName: "kube-api-access-z4bs7") pod "f45b6e84-264a-43a0-8103-86b94fbbc5a5" (UID: "f45b6e84-264a-43a0-8103-86b94fbbc5a5"). InnerVolumeSpecName "kube-api-access-z4bs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.201694 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph" (OuterVolumeSpecName: "ceph") pod "f45b6e84-264a-43a0-8103-86b94fbbc5a5" (UID: "f45b6e84-264a-43a0-8103-86b94fbbc5a5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.206483 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory" (OuterVolumeSpecName: "inventory") pod "f45b6e84-264a-43a0-8103-86b94fbbc5a5" (UID: "f45b6e84-264a-43a0-8103-86b94fbbc5a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.206723 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f45b6e84-264a-43a0-8103-86b94fbbc5a5" (UID: "f45b6e84-264a-43a0-8103-86b94fbbc5a5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.282628 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.282678 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.282692 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f45b6e84-264a-43a0-8103-86b94fbbc5a5-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.282704 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bs7\" (UniqueName: \"kubernetes.io/projected/f45b6e84-264a-43a0-8103-86b94fbbc5a5-kube-api-access-z4bs7\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.289587 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:29:02 crc kubenswrapper[4894]: E0613 05:29:02.289852 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.459411 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" event={"ID":"f45b6e84-264a-43a0-8103-86b94fbbc5a5","Type":"ContainerDied","Data":"852989c3afc6bbb804f683df7cf0f3867d8a880b2f396027f449663a0bfa5dbf"} Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.459467 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="852989c3afc6bbb804f683df7cf0f3867d8a880b2f396027f449663a0bfa5dbf" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.459430 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.461002 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vfznm" event={"ID":"215ea9ff-ba1d-43da-94cb-4bf412600392","Type":"ContainerStarted","Data":"dc7497e967dc1ac6beb6400c658ff764181e275a36a506c83f773e96ee6839b5"} Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.461169 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-vfznm" event={"ID":"215ea9ff-ba1d-43da-94cb-4bf412600392","Type":"ContainerStarted","Data":"f78da07d1a093c30260fd749cd2699b3ecfb834013073023076a9dec92ad7ba3"} Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.483214 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-vfznm" podStartSLOduration=1.483199087 podStartE2EDuration="1.483199087s" podCreationTimestamp="2025-06-13 05:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:29:02.473809711 +0000 UTC m=+2300.920057184" watchObservedRunningTime="2025-06-13 05:29:02.483199087 +0000 UTC m=+2300.929446550" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.577686 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q"] Jun 13 05:29:02 crc kubenswrapper[4894]: E0613 05:29:02.578273 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45b6e84-264a-43a0-8103-86b94fbbc5a5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.578345 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45b6e84-264a-43a0-8103-86b94fbbc5a5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.578575 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f45b6e84-264a-43a0-8103-86b94fbbc5a5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.579147 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.581620 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.582022 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.582242 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.582426 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.584861 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.591520 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q"] Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.689166 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.689235 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-965hd\" (UniqueName: \"kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.689286 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.689376 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.791131 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.791210 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-965hd\" (UniqueName: 
\"kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.791267 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.791359 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.796375 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.796931 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.797137 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.808842 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-965hd\" (UniqueName: \"kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znc5q\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:02 crc kubenswrapper[4894]: I0613 05:29:02.900260 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:03 crc kubenswrapper[4894]: I0613 05:29:03.546388 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q"] Jun 13 05:29:04 crc kubenswrapper[4894]: I0613 05:29:04.481204 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" event={"ID":"0ac5d831-55f4-495d-95ca-af1497a809e6","Type":"ContainerStarted","Data":"ef0fa6b0e42699364ff65ac281391287a0f50647d4b0c67af40cfbe7b416e181"} Jun 13 05:29:04 crc kubenswrapper[4894]: I0613 05:29:04.481532 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" event={"ID":"0ac5d831-55f4-495d-95ca-af1497a809e6","Type":"ContainerStarted","Data":"da358a720b31473025b7d5d2ecedddc46a91c610ffbb97067fa7616290eb9362"} Jun 13 05:29:04 crc kubenswrapper[4894]: I0613 05:29:04.503137 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" podStartSLOduration=2.062567329 podStartE2EDuration="2.503120587s" podCreationTimestamp="2025-06-13 05:29:02 +0000 UTC" firstStartedPulling="2025-06-13 05:29:03.535863302 +0000 UTC m=+2301.982110765" lastFinishedPulling="2025-06-13 05:29:03.97641656 +0000 UTC m=+2302.422664023" observedRunningTime="2025-06-13 05:29:04.501996315 +0000 UTC m=+2302.948243788" watchObservedRunningTime="2025-06-13 05:29:04.503120587 +0000 UTC m=+2302.949368060" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.471190 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-vfznm"] Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.471864 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-vfznm" podUID="215ea9ff-ba1d-43da-94cb-4bf412600392" containerName="container-00" containerID="cri-o://dc7497e967dc1ac6beb6400c658ff764181e275a36a506c83f773e96ee6839b5" gracePeriod=2 Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.484621 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-vfznm"] Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.565139 4894 generic.go:334] "Generic (PLEG): container finished" podID="215ea9ff-ba1d-43da-94cb-4bf412600392" containerID="dc7497e967dc1ac6beb6400c658ff764181e275a36a506c83f773e96ee6839b5" exitCode=0 Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.565482 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78da07d1a093c30260fd749cd2699b3ecfb834013073023076a9dec92ad7ba3" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.598265 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vfznm" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.693279 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq65d\" (UniqueName: \"kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d\") pod \"215ea9ff-ba1d-43da-94cb-4bf412600392\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.693419 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host\") pod \"215ea9ff-ba1d-43da-94cb-4bf412600392\" (UID: \"215ea9ff-ba1d-43da-94cb-4bf412600392\") " Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.693640 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host" (OuterVolumeSpecName: "host") pod "215ea9ff-ba1d-43da-94cb-4bf412600392" (UID: "215ea9ff-ba1d-43da-94cb-4bf412600392"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.694366 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215ea9ff-ba1d-43da-94cb-4bf412600392-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.701589 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d" (OuterVolumeSpecName: "kube-api-access-vq65d") pod "215ea9ff-ba1d-43da-94cb-4bf412600392" (UID: "215ea9ff-ba1d-43da-94cb-4bf412600392"). InnerVolumeSpecName "kube-api-access-vq65d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:29:12 crc kubenswrapper[4894]: I0613 05:29:12.796040 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq65d\" (UniqueName: \"kubernetes.io/projected/215ea9ff-ba1d-43da-94cb-4bf412600392-kube-api-access-vq65d\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:13 crc kubenswrapper[4894]: I0613 05:29:13.575789 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-vfznm" Jun 13 05:29:14 crc kubenswrapper[4894]: I0613 05:29:14.290655 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215ea9ff-ba1d-43da-94cb-4bf412600392" path="/var/lib/kubelet/pods/215ea9ff-ba1d-43da-94cb-4bf412600392/volumes" Jun 13 05:29:16 crc kubenswrapper[4894]: I0613 05:29:16.276609 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:29:16 crc kubenswrapper[4894]: E0613 05:29:16.277234 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:29:29 crc kubenswrapper[4894]: I0613 05:29:29.277163 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:29:29 crc kubenswrapper[4894]: E0613 05:29:29.278224 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:29:40 crc kubenswrapper[4894]: I0613 05:29:40.276832 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:29:40 crc kubenswrapper[4894]: E0613 05:29:40.277907 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:29:48 crc kubenswrapper[4894]: I0613 05:29:48.953508 4894 generic.go:334] "Generic (PLEG): container finished" podID="0ac5d831-55f4-495d-95ca-af1497a809e6" containerID="ef0fa6b0e42699364ff65ac281391287a0f50647d4b0c67af40cfbe7b416e181" exitCode=0 Jun 13 05:29:48 crc kubenswrapper[4894]: I0613 05:29:48.953554 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" event={"ID":"0ac5d831-55f4-495d-95ca-af1497a809e6","Type":"ContainerDied","Data":"ef0fa6b0e42699364ff65ac281391287a0f50647d4b0c67af40cfbe7b416e181"} Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.480301 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.603195 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph\") pod \"0ac5d831-55f4-495d-95ca-af1497a809e6\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.603249 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory\") pod \"0ac5d831-55f4-495d-95ca-af1497a809e6\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.603366 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key\") pod \"0ac5d831-55f4-495d-95ca-af1497a809e6\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.603401 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-965hd\" (UniqueName: \"kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd\") pod \"0ac5d831-55f4-495d-95ca-af1497a809e6\" (UID: \"0ac5d831-55f4-495d-95ca-af1497a809e6\") " Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.609551 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph" (OuterVolumeSpecName: "ceph") pod "0ac5d831-55f4-495d-95ca-af1497a809e6" (UID: "0ac5d831-55f4-495d-95ca-af1497a809e6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.610227 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd" (OuterVolumeSpecName: "kube-api-access-965hd") pod "0ac5d831-55f4-495d-95ca-af1497a809e6" (UID: "0ac5d831-55f4-495d-95ca-af1497a809e6"). InnerVolumeSpecName "kube-api-access-965hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.637908 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory" (OuterVolumeSpecName: "inventory") pod "0ac5d831-55f4-495d-95ca-af1497a809e6" (UID: "0ac5d831-55f4-495d-95ca-af1497a809e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.654645 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ac5d831-55f4-495d-95ca-af1497a809e6" (UID: "0ac5d831-55f4-495d-95ca-af1497a809e6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.708516 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.708589 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.708609 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac5d831-55f4-495d-95ca-af1497a809e6-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.708628 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-965hd\" (UniqueName: \"kubernetes.io/projected/0ac5d831-55f4-495d-95ca-af1497a809e6-kube-api-access-965hd\") on node \"crc\" DevicePath \"\"" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.982547 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.982392 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znc5q" event={"ID":"0ac5d831-55f4-495d-95ca-af1497a809e6","Type":"ContainerDied","Data":"da358a720b31473025b7d5d2ecedddc46a91c610ffbb97067fa7616290eb9362"} Jun 13 05:29:50 crc kubenswrapper[4894]: I0613 05:29:50.983274 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da358a720b31473025b7d5d2ecedddc46a91c610ffbb97067fa7616290eb9362" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.100875 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58"] Jun 13 05:29:51 crc kubenswrapper[4894]: E0613 05:29:51.101267 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac5d831-55f4-495d-95ca-af1497a809e6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.101283 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac5d831-55f4-495d-95ca-af1497a809e6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:51 crc kubenswrapper[4894]: E0613 05:29:51.101296 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215ea9ff-ba1d-43da-94cb-4bf412600392" containerName="container-00" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.101302 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="215ea9ff-ba1d-43da-94cb-4bf412600392" containerName="container-00" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.101534 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac5d831-55f4-495d-95ca-af1497a809e6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.101548 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="215ea9ff-ba1d-43da-94cb-4bf412600392" containerName="container-00" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.102137 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.105306 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.105798 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.106145 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.106791 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.108457 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.120934 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58"] Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.223706 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.223762 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.223789 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.223877 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwwpf\" (UniqueName: \"kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.277607 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:29:51 crc kubenswrapper[4894]: E0613 05:29:51.278042 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.325906 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.326005 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.326056 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.326182 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwwpf\" (UniqueName: \"kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.330214 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.332469 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.333381 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc kubenswrapper[4894]: I0613 05:29:51.359788 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwwpf\" (UniqueName: \"kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:51 crc 
kubenswrapper[4894]: I0613 05:29:51.429298 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:29:52 crc kubenswrapper[4894]: I0613 05:29:52.100836 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58"] Jun 13 05:29:52 crc kubenswrapper[4894]: W0613 05:29:52.101538 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1732efba_04bc_4c2f_9966_7e1ff39add5c.slice/crio-0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225 WatchSource:0}: Error finding container 0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225: Status 404 returned error can't find the container with id 0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225 Jun 13 05:29:53 crc kubenswrapper[4894]: I0613 05:29:53.012096 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" event={"ID":"1732efba-04bc-4c2f-9966-7e1ff39add5c","Type":"ContainerStarted","Data":"4a84d8bd5c1732f05d4eb08f8748e128f66e8b73e316e83709043a30c742c10b"} Jun 13 05:29:53 crc kubenswrapper[4894]: I0613 05:29:53.012673 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" event={"ID":"1732efba-04bc-4c2f-9966-7e1ff39add5c","Type":"ContainerStarted","Data":"0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225"} Jun 13 05:29:53 crc kubenswrapper[4894]: I0613 05:29:53.040325 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" podStartSLOduration=1.401660159 podStartE2EDuration="2.040302577s" podCreationTimestamp="2025-06-13 05:29:51 +0000 UTC" firstStartedPulling="2025-06-13 05:29:52.104502542 +0000 UTC m=+2350.550750015" lastFinishedPulling="2025-06-13 05:29:52.74314496 +0000 UTC m=+2351.189392433" observedRunningTime="2025-06-13 05:29:53.030432027 +0000 UTC m=+2351.476679500" watchObservedRunningTime="2025-06-13 05:29:53.040302577 +0000 UTC m=+2351.486550050" Jun 13 05:29:59 crc kubenswrapper[4894]: I0613 05:29:59.066268 4894 generic.go:334] "Generic (PLEG): container finished" podID="1732efba-04bc-4c2f-9966-7e1ff39add5c" containerID="4a84d8bd5c1732f05d4eb08f8748e128f66e8b73e316e83709043a30c742c10b" exitCode=0 Jun 13 05:29:59 crc kubenswrapper[4894]: I0613 05:29:59.066406 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" event={"ID":"1732efba-04bc-4c2f-9966-7e1ff39add5c","Type":"ContainerDied","Data":"4a84d8bd5c1732f05d4eb08f8748e128f66e8b73e316e83709043a30c742c10b"} Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.163976 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj"] Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.166334 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.196699 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.197023 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.249813 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj"] Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.350744 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.350794 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.350849 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwb5\" (UniqueName: \"kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.452688 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwb5\" (UniqueName: \"kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.452864 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.452911 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.454068 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume\") pod 
\"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.458779 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.468235 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwb5\" (UniqueName: \"kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5\") pod \"collect-profiles-29163210-gwrkj\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.530181 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.609645 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.757667 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph\") pod \"1732efba-04bc-4c2f-9966-7e1ff39add5c\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.757753 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwwpf\" (UniqueName: \"kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf\") pod \"1732efba-04bc-4c2f-9966-7e1ff39add5c\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.757831 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key\") pod \"1732efba-04bc-4c2f-9966-7e1ff39add5c\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.757857 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory\") pod \"1732efba-04bc-4c2f-9966-7e1ff39add5c\" (UID: \"1732efba-04bc-4c2f-9966-7e1ff39add5c\") " Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.761564 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph" (OuterVolumeSpecName: "ceph") pod "1732efba-04bc-4c2f-9966-7e1ff39add5c" (UID: "1732efba-04bc-4c2f-9966-7e1ff39add5c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.779246 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf" (OuterVolumeSpecName: "kube-api-access-pwwpf") pod "1732efba-04bc-4c2f-9966-7e1ff39add5c" (UID: "1732efba-04bc-4c2f-9966-7e1ff39add5c"). InnerVolumeSpecName "kube-api-access-pwwpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.781387 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1732efba-04bc-4c2f-9966-7e1ff39add5c" (UID: "1732efba-04bc-4c2f-9966-7e1ff39add5c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.783681 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory" (OuterVolumeSpecName: "inventory") pod "1732efba-04bc-4c2f-9966-7e1ff39add5c" (UID: "1732efba-04bc-4c2f-9966-7e1ff39add5c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.859781 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.859815 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwwpf\" (UniqueName: \"kubernetes.io/projected/1732efba-04bc-4c2f-9966-7e1ff39add5c-kube-api-access-pwwpf\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.859824 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.859835 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1732efba-04bc-4c2f-9966-7e1ff39add5c-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:00 crc kubenswrapper[4894]: I0613 05:30:00.994286 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj"] Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.082624 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" event={"ID":"10dff6d6-d204-4c97-a84a-0fa96d4484fb","Type":"ContainerStarted","Data":"4b96559acb818647803362546c6cfecde923d4fa5e0ea817b78fc23a57b58c4a"} Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.086137 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" event={"ID":"1732efba-04bc-4c2f-9966-7e1ff39add5c","Type":"ContainerDied","Data":"0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225"} Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.086177 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0321576cb197800948b5dd7fdfea4faa0392bdff0d9c4f40905c565391111225" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.086207 4894 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.196613 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt"] Jun 13 05:30:01 crc kubenswrapper[4894]: E0613 05:30:01.197801 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1732efba-04bc-4c2f-9966-7e1ff39add5c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.197879 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1732efba-04bc-4c2f-9966-7e1ff39add5c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.198127 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1732efba-04bc-4c2f-9966-7e1ff39add5c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.198719 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.201513 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.205841 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.205900 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.205842 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.206162 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.207992 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt"] Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.369031 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.369117 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.369167 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: 
\"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.369216 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98f72\" (UniqueName: \"kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.470314 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.470397 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.470439 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.470506 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98f72\" (UniqueName: \"kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.476245 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.476970 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.478156 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.492435 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98f72\" (UniqueName: \"kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n26xt\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.529993 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.875914 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-tmmln"] Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.877260 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-tmmln" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.879813 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.980752 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:01 crc kubenswrapper[4894]: I0613 05:30:01.980805 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qxl\" (UniqueName: \"kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.082100 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.082383 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qxl\" (UniqueName: \"kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.082263 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.097264 4894 generic.go:334] "Generic (PLEG): container finished" podID="10dff6d6-d204-4c97-a84a-0fa96d4484fb" containerID="a192374dfafc96ade870d68c863f3b3db4679b3521ff1cb46217fda5c1ab1e70" exitCode=0 Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.097322 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" 
event={"ID":"10dff6d6-d204-4c97-a84a-0fa96d4484fb","Type":"ContainerDied","Data":"a192374dfafc96ade870d68c863f3b3db4679b3521ff1cb46217fda5c1ab1e70"} Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.106953 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qxl\" (UniqueName: \"kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl\") pod \"crc-debug-tmmln\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " pod="openstack/crc-debug-tmmln" Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.135465 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt"] Jun 13 05:30:02 crc kubenswrapper[4894]: I0613 05:30:02.207528 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-tmmln" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.107842 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" event={"ID":"5936607e-9f35-4b2d-95c2-abfba163575e","Type":"ContainerStarted","Data":"902b76dd8185a48de0d3775c57ed29399e2c0b479e44e240d91519e6aba64947"} Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.108502 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" event={"ID":"5936607e-9f35-4b2d-95c2-abfba163575e","Type":"ContainerStarted","Data":"d7a2f72283ffd770125fd4d986d6ba6a161ed2d4ec3e64f80d1a7b270a5dde6d"} Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.110637 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-tmmln" event={"ID":"c948c9a4-5825-49ad-a1cf-eaf8fcac430f","Type":"ContainerStarted","Data":"4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6"} Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.110699 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-tmmln" event={"ID":"c948c9a4-5825-49ad-a1cf-eaf8fcac430f","Type":"ContainerStarted","Data":"39a98bfb71e51ebc5284ac00c37f3f7c8214c5458c41d24cab9dc11c79d9a2dc"} Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.149387 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" podStartSLOduration=1.4864706509999999 podStartE2EDuration="2.149334045s" podCreationTimestamp="2025-06-13 05:30:01 +0000 UTC" firstStartedPulling="2025-06-13 05:30:02.139084271 +0000 UTC m=+2360.585331744" lastFinishedPulling="2025-06-13 05:30:02.801947665 +0000 UTC m=+2361.248195138" observedRunningTime="2025-06-13 05:30:03.126484047 +0000 UTC m=+2361.572731510" watchObservedRunningTime="2025-06-13 05:30:03.149334045 +0000 UTC m=+2361.595581508" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.155289 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-tmmln" podStartSLOduration=2.155274283 podStartE2EDuration="2.155274283s" podCreationTimestamp="2025-06-13 05:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:30:03.143901421 +0000 UTC m=+2361.590148894" watchObservedRunningTime="2025-06-13 05:30:03.155274283 +0000 UTC m=+2361.601521746" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.428451 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.608818 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwb5\" (UniqueName: \"kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5\") pod \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.608852 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume\") pod \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.608894 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume\") pod \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\" (UID: \"10dff6d6-d204-4c97-a84a-0fa96d4484fb\") " Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.610034 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "10dff6d6-d204-4c97-a84a-0fa96d4484fb" (UID: "10dff6d6-d204-4c97-a84a-0fa96d4484fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.614755 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10dff6d6-d204-4c97-a84a-0fa96d4484fb" (UID: "10dff6d6-d204-4c97-a84a-0fa96d4484fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.615431 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5" (OuterVolumeSpecName: "kube-api-access-bxwb5") pod "10dff6d6-d204-4c97-a84a-0fa96d4484fb" (UID: "10dff6d6-d204-4c97-a84a-0fa96d4484fb"). InnerVolumeSpecName "kube-api-access-bxwb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.710096 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwb5\" (UniqueName: \"kubernetes.io/projected/10dff6d6-d204-4c97-a84a-0fa96d4484fb-kube-api-access-bxwb5\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.710290 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dff6d6-d204-4c97-a84a-0fa96d4484fb-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:03 crc kubenswrapper[4894]: I0613 05:30:03.710389 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dff6d6-d204-4c97-a84a-0fa96d4484fb-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:04 crc kubenswrapper[4894]: I0613 05:30:04.120559 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" event={"ID":"10dff6d6-d204-4c97-a84a-0fa96d4484fb","Type":"ContainerDied","Data":"4b96559acb818647803362546c6cfecde923d4fa5e0ea817b78fc23a57b58c4a"} Jun 13 05:30:04 crc kubenswrapper[4894]: I0613 05:30:04.120605 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b96559acb818647803362546c6cfecde923d4fa5e0ea817b78fc23a57b58c4a" Jun 13 05:30:04 crc kubenswrapper[4894]: I0613 05:30:04.121015 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163210-gwrkj" Jun 13 05:30:04 crc kubenswrapper[4894]: I0613 05:30:04.544830 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd"] Jun 13 05:30:04 crc kubenswrapper[4894]: I0613 05:30:04.556811 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163165-4wqmd"] Jun 13 05:30:05 crc kubenswrapper[4894]: I0613 05:30:05.276974 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:30:05 crc kubenswrapper[4894]: E0613 05:30:05.277161 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:30:06 crc kubenswrapper[4894]: I0613 05:30:06.291758 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e188bf-fe34-404b-8bb3-ec0ca09e013d" path="/var/lib/kubelet/pods/58e188bf-fe34-404b-8bb3-ec0ca09e013d/volumes" Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.710881 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-tmmln"] Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.711873 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-tmmln" podUID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" containerName="container-00" containerID="cri-o://4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6" gracePeriod=2 Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.720100 4894 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/crc-debug-tmmln"] Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.804259 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-tmmln" Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.912848 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qxl\" (UniqueName: \"kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl\") pod \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.912937 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host\") pod \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\" (UID: \"c948c9a4-5825-49ad-a1cf-eaf8fcac430f\") " Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.913105 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host" (OuterVolumeSpecName: "host") pod "c948c9a4-5825-49ad-a1cf-eaf8fcac430f" (UID: "c948c9a4-5825-49ad-a1cf-eaf8fcac430f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.913578 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:12 crc kubenswrapper[4894]: I0613 05:30:12.921020 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl" (OuterVolumeSpecName: "kube-api-access-b5qxl") pod "c948c9a4-5825-49ad-a1cf-eaf8fcac430f" (UID: "c948c9a4-5825-49ad-a1cf-eaf8fcac430f"). InnerVolumeSpecName "kube-api-access-b5qxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.015913 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5qxl\" (UniqueName: \"kubernetes.io/projected/c948c9a4-5825-49ad-a1cf-eaf8fcac430f-kube-api-access-b5qxl\") on node \"crc\" DevicePath \"\"" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.216293 4894 generic.go:334] "Generic (PLEG): container finished" podID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" containerID="4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6" exitCode=0 Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.216386 4894 scope.go:117] "RemoveContainer" containerID="4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.216399 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-tmmln" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.265028 4894 scope.go:117] "RemoveContainer" containerID="4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6" Jun 13 05:30:13 crc kubenswrapper[4894]: E0613 05:30:13.266379 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6\": container with ID starting with 4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6 not found: ID does not exist" containerID="4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.266472 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6"} err="failed to get container status \"4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6\": rpc error: code = NotFound desc = could not find container \"4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6\": container with ID starting with 4f1068ea3b94c404a59e32e74dd60063b7d4a1125790a7b1996e60bc511337d6 not found: ID does not exist" Jun 13 05:30:13 crc kubenswrapper[4894]: I0613 05:30:13.954309 4894 scope.go:117] "RemoveContainer" containerID="ca4a81ad914b49013193ba58ff55bdd4f7d7cea44844f8e6d9abef1c28f93a8e" Jun 13 05:30:14 crc kubenswrapper[4894]: I0613 05:30:14.296555 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" path="/var/lib/kubelet/pods/c948c9a4-5825-49ad-a1cf-eaf8fcac430f/volumes" Jun 13 05:30:18 crc kubenswrapper[4894]: I0613 05:30:18.276882 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:30:18 crc kubenswrapper[4894]: E0613 05:30:18.279647 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:30:31 crc kubenswrapper[4894]: I0613 05:30:31.277158 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:30:31 crc kubenswrapper[4894]: E0613 05:30:31.278178 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:30:46 crc kubenswrapper[4894]: I0613 05:30:46.277352 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:30:46 crc kubenswrapper[4894]: E0613 05:30:46.278372 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:30:59 crc kubenswrapper[4894]: I0613 05:30:59.276376 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:30:59 crc kubenswrapper[4894]: E0613 05:30:59.277082 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:30:59 crc kubenswrapper[4894]: I0613 05:30:59.680772 4894 generic.go:334] "Generic (PLEG): container finished" podID="5936607e-9f35-4b2d-95c2-abfba163575e" containerID="902b76dd8185a48de0d3775c57ed29399e2c0b479e44e240d91519e6aba64947" exitCode=0 Jun 13 05:30:59 crc kubenswrapper[4894]: I0613 05:30:59.680893 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" event={"ID":"5936607e-9f35-4b2d-95c2-abfba163575e","Type":"ContainerDied","Data":"902b76dd8185a48de0d3775c57ed29399e2c0b479e44e240d91519e6aba64947"} Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.141892 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.193096 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98f72\" (UniqueName: \"kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72\") pod \"5936607e-9f35-4b2d-95c2-abfba163575e\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.193137 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key\") pod \"5936607e-9f35-4b2d-95c2-abfba163575e\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.193158 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory\") pod \"5936607e-9f35-4b2d-95c2-abfba163575e\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.193283 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph\") pod \"5936607e-9f35-4b2d-95c2-abfba163575e\" (UID: \"5936607e-9f35-4b2d-95c2-abfba163575e\") " Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.198276 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72" (OuterVolumeSpecName: "kube-api-access-98f72") pod "5936607e-9f35-4b2d-95c2-abfba163575e" (UID: "5936607e-9f35-4b2d-95c2-abfba163575e"). InnerVolumeSpecName "kube-api-access-98f72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.210690 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph" (OuterVolumeSpecName: "ceph") pod "5936607e-9f35-4b2d-95c2-abfba163575e" (UID: "5936607e-9f35-4b2d-95c2-abfba163575e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.218292 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory" (OuterVolumeSpecName: "inventory") pod "5936607e-9f35-4b2d-95c2-abfba163575e" (UID: "5936607e-9f35-4b2d-95c2-abfba163575e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.221805 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5936607e-9f35-4b2d-95c2-abfba163575e" (UID: "5936607e-9f35-4b2d-95c2-abfba163575e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.295059 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98f72\" (UniqueName: \"kubernetes.io/projected/5936607e-9f35-4b2d-95c2-abfba163575e-kube-api-access-98f72\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.295095 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.295111 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.295267 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5936607e-9f35-4b2d-95c2-abfba163575e-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.704122 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" event={"ID":"5936607e-9f35-4b2d-95c2-abfba163575e","Type":"ContainerDied","Data":"d7a2f72283ffd770125fd4d986d6ba6a161ed2d4ec3e64f80d1a7b270a5dde6d"} Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.704187 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7a2f72283ffd770125fd4d986d6ba6a161ed2d4ec3e64f80d1a7b270a5dde6d" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.704273 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n26xt" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.794565 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-nbbfz"] Jun 13 05:31:01 crc kubenswrapper[4894]: E0613 05:31:01.795189 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10dff6d6-d204-4c97-a84a-0fa96d4484fb" containerName="collect-profiles" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795219 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="10dff6d6-d204-4c97-a84a-0fa96d4484fb" containerName="collect-profiles" Jun 13 05:31:01 crc kubenswrapper[4894]: E0613 05:31:01.795268 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" containerName="container-00" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795281 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" containerName="container-00" Jun 13 05:31:01 crc kubenswrapper[4894]: E0613 05:31:01.795320 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5936607e-9f35-4b2d-95c2-abfba163575e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795336 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5936607e-9f35-4b2d-95c2-abfba163575e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795875 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5936607e-9f35-4b2d-95c2-abfba163575e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795912 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c948c9a4-5825-49ad-a1cf-eaf8fcac430f" containerName="container-00" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.795953 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="10dff6d6-d204-4c97-a84a-0fa96d4484fb" containerName="collect-profiles" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.796991 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.805149 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.805205 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.805503 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.805626 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.805802 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.817076 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-nbbfz"] Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.907318 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.907420 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.907459 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxwhm\" (UniqueName: \"kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:01 crc kubenswrapper[4894]: I0613 05:31:01.907518 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.008859 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.009317 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" 
(UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.009466 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.009527 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxwhm\" (UniqueName: \"kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.020119 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.021475 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.022399 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.042724 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxwhm\" (UniqueName: \"kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm\") pod \"ssh-known-hosts-edpm-deployment-nbbfz\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.072293 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-jgdcf"] Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.074497 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.078607 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.111090 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.111129 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghph5\" (UniqueName: \"kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.133505 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.214723 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.214798 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghph5\" (UniqueName: \"kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.214859 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.253973 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghph5\" (UniqueName: \"kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5\") pod \"crc-debug-jgdcf\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.420403 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-jgdcf" Jun 13 05:31:02 crc kubenswrapper[4894]: W0613 05:31:02.457223 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d027132_f07a_4084_8c69_16246ddbc6a7.slice/crio-a67f84ff466dc0fa11e8300032db3551bab8153b0b0a16150a5f7e52ba1d6431 WatchSource:0}: Error finding container a67f84ff466dc0fa11e8300032db3551bab8153b0b0a16150a5f7e52ba1d6431: Status 404 returned error can't find the container with id a67f84ff466dc0fa11e8300032db3551bab8153b0b0a16150a5f7e52ba1d6431 Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.687701 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-nbbfz"] Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.718039 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" event={"ID":"f58c6e89-c103-43f0-b2a7-fcb21a7f677c","Type":"ContainerStarted","Data":"61940db9a6c0776018cc84fe6670bed1271a1afcd0bd1d1f5db1f53d82c1d833"} Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.719762 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jgdcf" event={"ID":"4d027132-f07a-4084-8c69-16246ddbc6a7","Type":"ContainerStarted","Data":"d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049"} Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.719787 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-jgdcf" event={"ID":"4d027132-f07a-4084-8c69-16246ddbc6a7","Type":"ContainerStarted","Data":"a67f84ff466dc0fa11e8300032db3551bab8153b0b0a16150a5f7e52ba1d6431"} Jun 13 05:31:02 crc kubenswrapper[4894]: I0613 05:31:02.742234 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-jgdcf" podStartSLOduration=0.742218132 podStartE2EDuration="742.218132ms" podCreationTimestamp="2025-06-13 05:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:31:02.734870274 +0000 UTC m=+2421.181117737" watchObservedRunningTime="2025-06-13 05:31:02.742218132 +0000 UTC m=+2421.188465595" Jun 13 05:31:03 crc kubenswrapper[4894]: I0613 05:31:03.730319 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" event={"ID":"f58c6e89-c103-43f0-b2a7-fcb21a7f677c","Type":"ContainerStarted","Data":"1df71aa16d5d8bf43dfc6577f21e299d665690a8c5f603195fed70c01fcac630"} Jun 13 05:31:03 crc kubenswrapper[4894]: I0613 05:31:03.759514 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" podStartSLOduration=2.325639677 podStartE2EDuration="2.759487725s" podCreationTimestamp="2025-06-13 05:31:01 +0000 UTC" firstStartedPulling="2025-06-13 05:31:02.696898949 +0000 UTC m=+2421.143146402" lastFinishedPulling="2025-06-13 05:31:03.130746997 +0000 UTC m=+2421.576994450" observedRunningTime="2025-06-13 05:31:03.753786733 +0000 UTC m=+2422.200034236" watchObservedRunningTime="2025-06-13 05:31:03.759487725 +0000 UTC m=+2422.205735218" Jun 13 05:31:12 crc kubenswrapper[4894]: I0613 05:31:12.293970 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:31:12 crc kubenswrapper[4894]: E0613 05:31:12.295147 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:31:12 crc kubenswrapper[4894]: I0613 05:31:12.981045 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-jgdcf"] Jun 13 05:31:12 crc kubenswrapper[4894]: I0613 05:31:12.981342 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-jgdcf" podUID="4d027132-f07a-4084-8c69-16246ddbc6a7" containerName="container-00" containerID="cri-o://d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049" gracePeriod=2 Jun 13 05:31:12 crc kubenswrapper[4894]: I0613 05:31:12.998548 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-jgdcf"] Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.052277 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jgdcf" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.242880 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghph5\" (UniqueName: \"kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5\") pod \"4d027132-f07a-4084-8c69-16246ddbc6a7\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.242981 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host\") pod \"4d027132-f07a-4084-8c69-16246ddbc6a7\" (UID: \"4d027132-f07a-4084-8c69-16246ddbc6a7\") " Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.243758 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host" (OuterVolumeSpecName: "host") pod "4d027132-f07a-4084-8c69-16246ddbc6a7" (UID: "4d027132-f07a-4084-8c69-16246ddbc6a7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.251483 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5" (OuterVolumeSpecName: "kube-api-access-ghph5") pod "4d027132-f07a-4084-8c69-16246ddbc6a7" (UID: "4d027132-f07a-4084-8c69-16246ddbc6a7"). InnerVolumeSpecName "kube-api-access-ghph5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.346055 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d027132-f07a-4084-8c69-16246ddbc6a7-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.346572 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghph5\" (UniqueName: \"kubernetes.io/projected/4d027132-f07a-4084-8c69-16246ddbc6a7-kube-api-access-ghph5\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.831030 4894 generic.go:334] "Generic (PLEG): container finished" podID="4d027132-f07a-4084-8c69-16246ddbc6a7" containerID="d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049" exitCode=0 Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.831110 4894 scope.go:117] "RemoveContainer" containerID="d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.831233 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-jgdcf" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.855293 4894 scope.go:117] "RemoveContainer" containerID="d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049" Jun 13 05:31:13 crc kubenswrapper[4894]: E0613 05:31:13.857187 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049\": container with ID starting with d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049 not found: ID does not exist" containerID="d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049" Jun 13 05:31:13 crc kubenswrapper[4894]: I0613 05:31:13.857233 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049"} err="failed to get container status \"d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049\": rpc error: code = NotFound desc = could not find container \"d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049\": container with ID starting with d2db6063df93899c58e2429dfeef0e3cf0ac275a97d18e7bc9bdbc8dcff44049 not found: ID does not exist" Jun 13 05:31:14 crc kubenswrapper[4894]: I0613 05:31:14.294974 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d027132-f07a-4084-8c69-16246ddbc6a7" path="/var/lib/kubelet/pods/4d027132-f07a-4084-8c69-16246ddbc6a7/volumes" Jun 13 05:31:14 crc kubenswrapper[4894]: I0613 05:31:14.845566 4894 generic.go:334] "Generic (PLEG): container finished" podID="f58c6e89-c103-43f0-b2a7-fcb21a7f677c" containerID="1df71aa16d5d8bf43dfc6577f21e299d665690a8c5f603195fed70c01fcac630" exitCode=0 Jun 13 05:31:14 crc kubenswrapper[4894]: I0613 05:31:14.845626 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" event={"ID":"f58c6e89-c103-43f0-b2a7-fcb21a7f677c","Type":"ContainerDied","Data":"1df71aa16d5d8bf43dfc6577f21e299d665690a8c5f603195fed70c01fcac630"} Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.396320 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.415846 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph\") pod \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.415909 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxwhm\" (UniqueName: \"kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm\") pod \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.415949 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam\") pod \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.415991 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0\") pod \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\" (UID: \"f58c6e89-c103-43f0-b2a7-fcb21a7f677c\") " Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.421212 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph" (OuterVolumeSpecName: "ceph") pod "f58c6e89-c103-43f0-b2a7-fcb21a7f677c" (UID: "f58c6e89-c103-43f0-b2a7-fcb21a7f677c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.421895 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm" (OuterVolumeSpecName: "kube-api-access-mxwhm") pod "f58c6e89-c103-43f0-b2a7-fcb21a7f677c" (UID: "f58c6e89-c103-43f0-b2a7-fcb21a7f677c"). InnerVolumeSpecName "kube-api-access-mxwhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.443383 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f58c6e89-c103-43f0-b2a7-fcb21a7f677c" (UID: "f58c6e89-c103-43f0-b2a7-fcb21a7f677c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.447035 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f58c6e89-c103-43f0-b2a7-fcb21a7f677c" (UID: "f58c6e89-c103-43f0-b2a7-fcb21a7f677c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.517264 4894 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-inventory-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.517304 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.517317 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxwhm\" (UniqueName: \"kubernetes.io/projected/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-kube-api-access-mxwhm\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.517330 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f58c6e89-c103-43f0-b2a7-fcb21a7f677c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.867861 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" event={"ID":"f58c6e89-c103-43f0-b2a7-fcb21a7f677c","Type":"ContainerDied","Data":"61940db9a6c0776018cc84fe6670bed1271a1afcd0bd1d1f5db1f53d82c1d833"} Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.868211 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61940db9a6c0776018cc84fe6670bed1271a1afcd0bd1d1f5db1f53d82c1d833" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.867944 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-nbbfz" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.963683 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f"] Jun 13 05:31:16 crc kubenswrapper[4894]: E0613 05:31:16.964034 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58c6e89-c103-43f0-b2a7-fcb21a7f677c" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.964056 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58c6e89-c103-43f0-b2a7-fcb21a7f677c" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:31:16 crc kubenswrapper[4894]: E0613 05:31:16.964086 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d027132-f07a-4084-8c69-16246ddbc6a7" containerName="container-00" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.964093 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d027132-f07a-4084-8c69-16246ddbc6a7" containerName="container-00" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.964253 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d027132-f07a-4084-8c69-16246ddbc6a7" containerName="container-00" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.964284 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58c6e89-c103-43f0-b2a7-fcb21a7f677c" containerName="ssh-known-hosts-edpm-deployment" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.964858 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.973363 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.973394 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.973712 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.975164 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.975862 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:31:16 crc kubenswrapper[4894]: I0613 05:31:16.976450 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f"] Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.126751 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.127221 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.127255 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx9n2\" (UniqueName: \"kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.127287 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.229412 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.229548 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph\") 
pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.229692 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.229716 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx9n2\" (UniqueName: \"kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.238304 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.239805 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.241165 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.262153 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx9n2\" (UniqueName: \"kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-v454f\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:17 crc kubenswrapper[4894]: I0613 05:31:17.285169 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:18 crc kubenswrapper[4894]: I0613 05:31:18.091485 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f"] Jun 13 05:31:18 crc kubenswrapper[4894]: I0613 05:31:18.885922 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" event={"ID":"8f396ee0-4caf-4c2c-a060-e46767e338c9","Type":"ContainerStarted","Data":"cecba0cb06d8d349b3ade174a123c6c3d2561f7e9f0d0effc77d5f45d9842dbb"} Jun 13 05:31:18 crc kubenswrapper[4894]: I0613 05:31:18.886190 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" event={"ID":"8f396ee0-4caf-4c2c-a060-e46767e338c9","Type":"ContainerStarted","Data":"0e0c2d6a3fe7df0614bbe1f0717c270655dfaae687ab3d4391bd335c68230aa3"} Jun 13 05:31:18 crc kubenswrapper[4894]: I0613 05:31:18.903491 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" podStartSLOduration=2.49328991 podStartE2EDuration="2.903475128s" podCreationTimestamp="2025-06-13 05:31:16 +0000 UTC" firstStartedPulling="2025-06-13 05:31:18.103089448 +0000 UTC m=+2436.549336921" lastFinishedPulling="2025-06-13 05:31:18.513274656 +0000 UTC m=+2436.959522139" observedRunningTime="2025-06-13 05:31:18.902225882 +0000 UTC m=+2437.348473345" watchObservedRunningTime="2025-06-13 05:31:18.903475128 +0000 UTC m=+2437.349722591" Jun 13 05:31:24 crc kubenswrapper[4894]: I0613 05:31:24.277264 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:31:24 crc kubenswrapper[4894]: E0613 05:31:24.278011 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:31:38 crc kubenswrapper[4894]: I0613 05:31:38.276924 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:31:38 crc kubenswrapper[4894]: E0613 05:31:38.277802 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:31:43 crc kubenswrapper[4894]: I0613 05:31:43.133601 4894 generic.go:334] "Generic (PLEG): container finished" podID="8f396ee0-4caf-4c2c-a060-e46767e338c9" containerID="cecba0cb06d8d349b3ade174a123c6c3d2561f7e9f0d0effc77d5f45d9842dbb" exitCode=0 Jun 13 05:31:43 crc kubenswrapper[4894]: I0613 05:31:43.133682 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" event={"ID":"8f396ee0-4caf-4c2c-a060-e46767e338c9","Type":"ContainerDied","Data":"cecba0cb06d8d349b3ade174a123c6c3d2561f7e9f0d0effc77d5f45d9842dbb"} Jun 13 05:31:44 crc 
kubenswrapper[4894]: I0613 05:31:44.593038 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.692417 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx9n2\" (UniqueName: \"kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2\") pod \"8f396ee0-4caf-4c2c-a060-e46767e338c9\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.692454 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph\") pod \"8f396ee0-4caf-4c2c-a060-e46767e338c9\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.692495 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory\") pod \"8f396ee0-4caf-4c2c-a060-e46767e338c9\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.692530 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key\") pod \"8f396ee0-4caf-4c2c-a060-e46767e338c9\" (UID: \"8f396ee0-4caf-4c2c-a060-e46767e338c9\") " Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.699620 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph" (OuterVolumeSpecName: "ceph") pod "8f396ee0-4caf-4c2c-a060-e46767e338c9" (UID: "8f396ee0-4caf-4c2c-a060-e46767e338c9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.699842 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2" (OuterVolumeSpecName: "kube-api-access-xx9n2") pod "8f396ee0-4caf-4c2c-a060-e46767e338c9" (UID: "8f396ee0-4caf-4c2c-a060-e46767e338c9"). InnerVolumeSpecName "kube-api-access-xx9n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.717304 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory" (OuterVolumeSpecName: "inventory") pod "8f396ee0-4caf-4c2c-a060-e46767e338c9" (UID: "8f396ee0-4caf-4c2c-a060-e46767e338c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.725165 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f396ee0-4caf-4c2c-a060-e46767e338c9" (UID: "8f396ee0-4caf-4c2c-a060-e46767e338c9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.794846 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx9n2\" (UniqueName: \"kubernetes.io/projected/8f396ee0-4caf-4c2c-a060-e46767e338c9-kube-api-access-xx9n2\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.794890 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.794911 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:44 crc kubenswrapper[4894]: I0613 05:31:44.794928 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f396ee0-4caf-4c2c-a060-e46767e338c9-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.160377 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" event={"ID":"8f396ee0-4caf-4c2c-a060-e46767e338c9","Type":"ContainerDied","Data":"0e0c2d6a3fe7df0614bbe1f0717c270655dfaae687ab3d4391bd335c68230aa3"} Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.160437 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0c2d6a3fe7df0614bbe1f0717c270655dfaae687ab3d4391bd335c68230aa3" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.160527 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-v454f" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.290156 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g"] Jun 13 05:31:45 crc kubenswrapper[4894]: E0613 05:31:45.290563 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f396ee0-4caf-4c2c-a060-e46767e338c9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.290578 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f396ee0-4caf-4c2c-a060-e46767e338c9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.290790 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f396ee0-4caf-4c2c-a060-e46767e338c9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.291546 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.297036 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.297674 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.297855 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.297964 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.298091 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.298786 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g"] Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.405093 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.405169 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.405252 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.405341 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdtbm\" (UniqueName: \"kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.507184 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.507269 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdtbm\" (UniqueName: 
\"kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.507311 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.507346 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.511403 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.512638 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.517293 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.534947 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdtbm\" (UniqueName: \"kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:45 crc kubenswrapper[4894]: I0613 05:31:45.611057 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:46 crc kubenswrapper[4894]: I0613 05:31:46.191363 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g"] Jun 13 05:31:47 crc kubenswrapper[4894]: I0613 05:31:47.183359 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" event={"ID":"40081dfe-b604-4593-a435-3310168c3c31","Type":"ContainerStarted","Data":"be83fdc88aa65569d013d58a02b6535e5efb2d68c7d9b9873026fd8fb7dcec07"} Jun 13 05:31:47 crc kubenswrapper[4894]: I0613 05:31:47.183939 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" event={"ID":"40081dfe-b604-4593-a435-3310168c3c31","Type":"ContainerStarted","Data":"b20ad05dc8c1c377f67f374e6f3012ec511e9434784d0124a8dcdde7eeba76e0"} Jun 13 05:31:47 crc kubenswrapper[4894]: I0613 05:31:47.212365 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" podStartSLOduration=1.785435559 podStartE2EDuration="2.21233996s" podCreationTimestamp="2025-06-13 05:31:45 +0000 UTC" firstStartedPulling="2025-06-13 05:31:46.193979296 +0000 UTC m=+2464.640226799" lastFinishedPulling="2025-06-13 05:31:46.620883707 +0000 UTC m=+2465.067131200" observedRunningTime="2025-06-13 05:31:47.201978196 +0000 UTC m=+2465.648225699" watchObservedRunningTime="2025-06-13 05:31:47.21233996 +0000 UTC m=+2465.658587453" Jun 13 05:31:53 crc kubenswrapper[4894]: I0613 05:31:53.276808 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:31:53 crc kubenswrapper[4894]: E0613 05:31:53.277539 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:31:56 crc kubenswrapper[4894]: I0613 05:31:56.278964 4894 generic.go:334] "Generic (PLEG): container finished" podID="40081dfe-b604-4593-a435-3310168c3c31" containerID="be83fdc88aa65569d013d58a02b6535e5efb2d68c7d9b9873026fd8fb7dcec07" exitCode=0 Jun 13 05:31:56 crc kubenswrapper[4894]: I0613 05:31:56.292885 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" event={"ID":"40081dfe-b604-4593-a435-3310168c3c31","Type":"ContainerDied","Data":"be83fdc88aa65569d013d58a02b6535e5efb2d68c7d9b9873026fd8fb7dcec07"} Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.749212 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.875079 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph\") pod \"40081dfe-b604-4593-a435-3310168c3c31\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.875155 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdtbm\" (UniqueName: \"kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm\") pod \"40081dfe-b604-4593-a435-3310168c3c31\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.875355 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory\") pod \"40081dfe-b604-4593-a435-3310168c3c31\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.875415 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key\") pod \"40081dfe-b604-4593-a435-3310168c3c31\" (UID: \"40081dfe-b604-4593-a435-3310168c3c31\") " Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.881247 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm" (OuterVolumeSpecName: "kube-api-access-rdtbm") pod "40081dfe-b604-4593-a435-3310168c3c31" (UID: "40081dfe-b604-4593-a435-3310168c3c31"). InnerVolumeSpecName "kube-api-access-rdtbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.883951 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph" (OuterVolumeSpecName: "ceph") pod "40081dfe-b604-4593-a435-3310168c3c31" (UID: "40081dfe-b604-4593-a435-3310168c3c31"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.916522 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory" (OuterVolumeSpecName: "inventory") pod "40081dfe-b604-4593-a435-3310168c3c31" (UID: "40081dfe-b604-4593-a435-3310168c3c31"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.916840 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40081dfe-b604-4593-a435-3310168c3c31" (UID: "40081dfe-b604-4593-a435-3310168c3c31"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.977802 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.977834 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.977843 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40081dfe-b604-4593-a435-3310168c3c31-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:57 crc kubenswrapper[4894]: I0613 05:31:57.977852 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdtbm\" (UniqueName: \"kubernetes.io/projected/40081dfe-b604-4593-a435-3310168c3c31-kube-api-access-rdtbm\") on node \"crc\" DevicePath \"\"" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.309074 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" event={"ID":"40081dfe-b604-4593-a435-3310168c3c31","Type":"ContainerDied","Data":"b20ad05dc8c1c377f67f374e6f3012ec511e9434784d0124a8dcdde7eeba76e0"} Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.309134 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b20ad05dc8c1c377f67f374e6f3012ec511e9434784d0124a8dcdde7eeba76e0" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.309218 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.427883 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g"] Jun 13 05:31:58 crc kubenswrapper[4894]: E0613 05:31:58.428289 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40081dfe-b604-4593-a435-3310168c3c31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.428309 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="40081dfe-b604-4593-a435-3310168c3c31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.428525 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="40081dfe-b604-4593-a435-3310168c3c31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.429739 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436290 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436538 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436677 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436828 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436968 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.436995 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.437076 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.437158 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.453137 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g"] Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.588951 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589010 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589042 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589106 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589142 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589178 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwg8\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589223 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589350 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589512 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589557 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589606 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 
crc kubenswrapper[4894]: I0613 05:31:58.589698 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.589735 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691512 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691868 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwg8\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691913 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691935 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691976 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.691997 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692023 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692051 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692069 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692102 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692119 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692139 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.692172 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc 
kubenswrapper[4894]: I0613 05:31:58.697369 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.697816 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.698768 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.699793 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.700083 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.705854 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.706061 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.708249 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: 
\"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.709315 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.709328 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.710831 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.713075 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.716424 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwg8\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c647g\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:58 crc kubenswrapper[4894]: I0613 05:31:58.748005 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:31:59 crc kubenswrapper[4894]: I0613 05:31:59.066982 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g"] Jun 13 05:31:59 crc kubenswrapper[4894]: I0613 05:31:59.319932 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" event={"ID":"130edae6-6f39-446a-8add-df5ce866f925","Type":"ContainerStarted","Data":"91fcd8bb2eaff04cd9909a724485743aeced2bc55eacd909700203cdb935d0c0"} Jun 13 05:32:00 crc kubenswrapper[4894]: I0613 05:32:00.329491 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" event={"ID":"130edae6-6f39-446a-8add-df5ce866f925","Type":"ContainerStarted","Data":"cdd12cd8773b056681089bd581217e1f98de8e7d242dcabec3fb1bc88b0d9c87"} Jun 13 05:32:00 crc kubenswrapper[4894]: I0613 05:32:00.378091 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" podStartSLOduration=1.8639034190000001 podStartE2EDuration="2.378069972s" podCreationTimestamp="2025-06-13 05:31:58 +0000 UTC" firstStartedPulling="2025-06-13 05:31:59.08607185 +0000 UTC m=+2477.532319313" lastFinishedPulling="2025-06-13 05:31:59.600238363 +0000 UTC m=+2478.046485866" observedRunningTime="2025-06-13 05:32:00.358432176 +0000 UTC m=+2478.804679679" watchObservedRunningTime="2025-06-13 05:32:00.378069972 +0000 UTC m=+2478.824317445" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.409385 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-j5rv5"] Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.411874 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.413634 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.474318 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.474460 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mttv9\" (UniqueName: \"kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.577414 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mttv9\" (UniqueName: \"kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.577612 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.577905 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.613223 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mttv9\" (UniqueName: \"kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9\") pod \"crc-debug-j5rv5\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: I0613 05:32:02.744332 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-j5rv5" Jun 13 05:32:02 crc kubenswrapper[4894]: W0613 05:32:02.797940 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82026850_669a_4ddd_afd4_ab65d0bc3f3e.slice/crio-669a4a77c2fcc371171993a39f779f58f2c72e932666729fbf0f10506bda2f69 WatchSource:0}: Error finding container 669a4a77c2fcc371171993a39f779f58f2c72e932666729fbf0f10506bda2f69: Status 404 returned error can't find the container with id 669a4a77c2fcc371171993a39f779f58f2c72e932666729fbf0f10506bda2f69 Jun 13 05:32:03 crc kubenswrapper[4894]: I0613 05:32:03.353808 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-j5rv5" event={"ID":"82026850-669a-4ddd-afd4-ab65d0bc3f3e","Type":"ContainerStarted","Data":"3c0ec4a32810875576994cf8b5c31c905c513b73b099473fef0b53e964623ca0"} Jun 13 05:32:03 crc kubenswrapper[4894]: I0613 05:32:03.354187 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-j5rv5" event={"ID":"82026850-669a-4ddd-afd4-ab65d0bc3f3e","Type":"ContainerStarted","Data":"669a4a77c2fcc371171993a39f779f58f2c72e932666729fbf0f10506bda2f69"} Jun 13 05:32:03 crc kubenswrapper[4894]: I0613 05:32:03.376729 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-j5rv5" podStartSLOduration=1.376701893 podStartE2EDuration="1.376701893s" podCreationTimestamp="2025-06-13 05:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:32:03.36602817 +0000 UTC m=+2481.812275673" watchObservedRunningTime="2025-06-13 05:32:03.376701893 +0000 UTC m=+2481.822949386" Jun 13 05:32:05 crc kubenswrapper[4894]: I0613 05:32:05.276536 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:32:06 crc kubenswrapper[4894]: I0613 05:32:06.386448 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0"} Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.409406 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-j5rv5"] Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.410194 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-j5rv5" podUID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" containerName="container-00" containerID="cri-o://3c0ec4a32810875576994cf8b5c31c905c513b73b099473fef0b53e964623ca0" gracePeriod=2 Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.423214 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-j5rv5"] Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.449302 4894 generic.go:334] "Generic (PLEG): container finished" podID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" containerID="3c0ec4a32810875576994cf8b5c31c905c513b73b099473fef0b53e964623ca0" exitCode=0 Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.449575 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669a4a77c2fcc371171993a39f779f58f2c72e932666729fbf0f10506bda2f69" Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.503002 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-j5rv5" Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.591299 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mttv9\" (UniqueName: \"kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9\") pod \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.591456 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host\") pod \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\" (UID: \"82026850-669a-4ddd-afd4-ab65d0bc3f3e\") " Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.591729 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host" (OuterVolumeSpecName: "host") pod "82026850-669a-4ddd-afd4-ab65d0bc3f3e" (UID: "82026850-669a-4ddd-afd4-ab65d0bc3f3e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.596597 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9" (OuterVolumeSpecName: "kube-api-access-mttv9") pod "82026850-669a-4ddd-afd4-ab65d0bc3f3e" (UID: "82026850-669a-4ddd-afd4-ab65d0bc3f3e"). InnerVolumeSpecName "kube-api-access-mttv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.693719 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mttv9\" (UniqueName: \"kubernetes.io/projected/82026850-669a-4ddd-afd4-ab65d0bc3f3e-kube-api-access-mttv9\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:13 crc kubenswrapper[4894]: I0613 05:32:13.693750 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82026850-669a-4ddd-afd4-ab65d0bc3f3e-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:14 crc kubenswrapper[4894]: I0613 05:32:14.289071 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" path="/var/lib/kubelet/pods/82026850-669a-4ddd-afd4-ab65d0bc3f3e/volumes" Jun 13 05:32:14 crc kubenswrapper[4894]: I0613 05:32:14.457436 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-j5rv5" Jun 13 05:32:35 crc kubenswrapper[4894]: I0613 05:32:35.683287 4894 generic.go:334] "Generic (PLEG): container finished" podID="130edae6-6f39-446a-8add-df5ce866f925" containerID="cdd12cd8773b056681089bd581217e1f98de8e7d242dcabec3fb1bc88b0d9c87" exitCode=0 Jun 13 05:32:35 crc kubenswrapper[4894]: I0613 05:32:35.683414 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" event={"ID":"130edae6-6f39-446a-8add-df5ce866f925","Type":"ContainerDied","Data":"cdd12cd8773b056681089bd581217e1f98de8e7d242dcabec3fb1bc88b0d9c87"} Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.137512 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282246 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282281 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282317 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282369 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282387 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282475 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282499 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwg8\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282526 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282540 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282581 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282610 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282710 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.282750 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle\") pod \"130edae6-6f39-446a-8add-df5ce866f925\" (UID: \"130edae6-6f39-446a-8add-df5ce866f925\") " Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.292753 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.292854 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.292882 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.294682 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8" (OuterVolumeSpecName: "kube-api-access-mlwg8") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "kube-api-access-mlwg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.295458 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.295479 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.296174 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.297806 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph" (OuterVolumeSpecName: "ceph") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.301819 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.303159 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.310794 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.313874 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory" (OuterVolumeSpecName: "inventory") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.322347 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "130edae6-6f39-446a-8add-df5ce866f925" (UID: "130edae6-6f39-446a-8add-df5ce866f925"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385015 4894 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385429 4894 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385524 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385623 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385730 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385862 4894 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.385952 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwg8\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-kube-api-access-mlwg8\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386071 4894 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386182 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386285 4894 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386381 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386473 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/130edae6-6f39-446a-8add-df5ce866f925-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.386552 4894 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130edae6-6f39-446a-8add-df5ce866f925-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.708506 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" event={"ID":"130edae6-6f39-446a-8add-df5ce866f925","Type":"ContainerDied","Data":"91fcd8bb2eaff04cd9909a724485743aeced2bc55eacd909700203cdb935d0c0"} Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.708567 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fcd8bb2eaff04cd9909a724485743aeced2bc55eacd909700203cdb935d0c0" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.708574 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c647g" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.818563 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd"] Jun 13 05:32:37 crc kubenswrapper[4894]: E0613 05:32:37.819102 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" containerName="container-00" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.819126 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" containerName="container-00" Jun 13 05:32:37 crc kubenswrapper[4894]: E0613 05:32:37.819137 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130edae6-6f39-446a-8add-df5ce866f925" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.819156 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="130edae6-6f39-446a-8add-df5ce866f925" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.819404 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="130edae6-6f39-446a-8add-df5ce866f925" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.819422 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="82026850-669a-4ddd-afd4-ab65d0bc3f3e" containerName="container-00" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.820184 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.825580 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.825728 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.826041 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.827687 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.828600 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.847762 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd"] Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.998279 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.998606 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.998733 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrwkt\" (UniqueName: \"kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:37 crc kubenswrapper[4894]: I0613 05:32:37.998830 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.100688 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrwkt\" (UniqueName: \"kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.100942 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.101192 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.101329 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.106437 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.107848 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.109777 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.118843 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrwkt\" (UniqueName: \"kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.136444 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.662160 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd"] Jun 13 05:32:38 crc kubenswrapper[4894]: W0613 05:32:38.672080 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a854efd_8626_4a9f_beec_678b2916fb09.slice/crio-a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e WatchSource:0}: Error finding container a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e: Status 404 returned error can't find the container with id a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e Jun 13 05:32:38 crc kubenswrapper[4894]: I0613 05:32:38.718089 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" event={"ID":"8a854efd-8626-4a9f-beec-678b2916fb09","Type":"ContainerStarted","Data":"a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e"} Jun 13 05:32:39 crc kubenswrapper[4894]: I0613 05:32:39.729766 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" event={"ID":"8a854efd-8626-4a9f-beec-678b2916fb09","Type":"ContainerStarted","Data":"daee073a04d28459122769e9c6bd56a4105ff78fb830b6037a6f884a51e7b07d"} Jun 13 05:32:39 crc kubenswrapper[4894]: I0613 05:32:39.769203 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" podStartSLOduration=2.315596309 podStartE2EDuration="2.769181265s" podCreationTimestamp="2025-06-13 05:32:37 +0000 UTC" firstStartedPulling="2025-06-13 05:32:38.674144571 +0000 UTC m=+2517.120392034" lastFinishedPulling="2025-06-13 05:32:39.127729527 +0000 UTC m=+2517.573976990" observedRunningTime="2025-06-13 05:32:39.757116033 +0000 UTC m=+2518.203363496" watchObservedRunningTime="2025-06-13 05:32:39.769181265 +0000 UTC m=+2518.215428748" Jun 13 05:32:46 crc kubenswrapper[4894]: I0613 05:32:46.797598 4894 generic.go:334] "Generic (PLEG): container finished" podID="8a854efd-8626-4a9f-beec-678b2916fb09" containerID="daee073a04d28459122769e9c6bd56a4105ff78fb830b6037a6f884a51e7b07d" exitCode=0 Jun 13 05:32:46 crc kubenswrapper[4894]: I0613 05:32:46.798256 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" event={"ID":"8a854efd-8626-4a9f-beec-678b2916fb09","Type":"ContainerDied","Data":"daee073a04d28459122769e9c6bd56a4105ff78fb830b6037a6f884a51e7b07d"} Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.420793 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.498629 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory\") pod \"8a854efd-8626-4a9f-beec-678b2916fb09\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.498763 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph\") pod \"8a854efd-8626-4a9f-beec-678b2916fb09\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.498820 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrwkt\" (UniqueName: \"kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt\") pod \"8a854efd-8626-4a9f-beec-678b2916fb09\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.498929 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key\") pod \"8a854efd-8626-4a9f-beec-678b2916fb09\" (UID: \"8a854efd-8626-4a9f-beec-678b2916fb09\") " Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.511793 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph" (OuterVolumeSpecName: "ceph") pod "8a854efd-8626-4a9f-beec-678b2916fb09" (UID: "8a854efd-8626-4a9f-beec-678b2916fb09"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.511895 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt" (OuterVolumeSpecName: "kube-api-access-wrwkt") pod "8a854efd-8626-4a9f-beec-678b2916fb09" (UID: "8a854efd-8626-4a9f-beec-678b2916fb09"). InnerVolumeSpecName "kube-api-access-wrwkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.525278 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory" (OuterVolumeSpecName: "inventory") pod "8a854efd-8626-4a9f-beec-678b2916fb09" (UID: "8a854efd-8626-4a9f-beec-678b2916fb09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.526417 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a854efd-8626-4a9f-beec-678b2916fb09" (UID: "8a854efd-8626-4a9f-beec-678b2916fb09"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.600696 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.600729 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.600741 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrwkt\" (UniqueName: \"kubernetes.io/projected/8a854efd-8626-4a9f-beec-678b2916fb09-kube-api-access-wrwkt\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.600750 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a854efd-8626-4a9f-beec-678b2916fb09-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.821854 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" event={"ID":"8a854efd-8626-4a9f-beec-678b2916fb09","Type":"ContainerDied","Data":"a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e"} Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.821900 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.821915 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fee9346adece1be11d58d01734d2af457c5bf01a7d9cb706c6b6015d1bf62e" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.921833 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl"] Jun 13 05:32:48 crc kubenswrapper[4894]: E0613 05:32:48.922186 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a854efd-8626-4a9f-beec-678b2916fb09" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.922203 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a854efd-8626-4a9f-beec-678b2916fb09" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.922347 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a854efd-8626-4a9f-beec-678b2916fb09" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.922927 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.926635 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.926805 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.926937 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.937131 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl"] Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.937491 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.937729 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:32:48 crc kubenswrapper[4894]: I0613 05:32:48.942916 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.007528 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtj7q\" (UniqueName: \"kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.007583 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.007645 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.007677 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.007709 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 
05:32:49.007759 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.109749 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.109843 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.109889 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtj7q\" (UniqueName: \"kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.109918 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.109995 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.110010 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.110701 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.113942 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.114619 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.116154 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.116154 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.134543 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtj7q\" (UniqueName: \"kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzzpl\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.245541 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:32:49 crc kubenswrapper[4894]: I0613 05:32:49.861136 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl"] Jun 13 05:32:50 crc kubenswrapper[4894]: I0613 05:32:50.844135 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" event={"ID":"32106b20-1e0f-4cd7-bbd8-b092163a9035","Type":"ContainerStarted","Data":"03d05b9dbe7807ff623e231472cc51373984b112c3aedccbe489e48b1f01b68b"} Jun 13 05:32:50 crc kubenswrapper[4894]: I0613 05:32:50.844567 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" event={"ID":"32106b20-1e0f-4cd7-bbd8-b092163a9035","Type":"ContainerStarted","Data":"c21b990a3b1afb9dddd84cfa539f34e3e7e4b5b525ada8a121774ed14a165d7d"} Jun 13 05:32:50 crc kubenswrapper[4894]: I0613 05:32:50.875575 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" podStartSLOduration=2.399695543 podStartE2EDuration="2.87554815s" podCreationTimestamp="2025-06-13 05:32:48 +0000 UTC" firstStartedPulling="2025-06-13 05:32:49.85048757 +0000 UTC m=+2528.296735043" lastFinishedPulling="2025-06-13 05:32:50.326340147 +0000 UTC m=+2528.772587650" observedRunningTime="2025-06-13 05:32:50.867337398 +0000 UTC m=+2529.313584901" watchObservedRunningTime="2025-06-13 05:32:50.87554815 +0000 UTC m=+2529.321795643" Jun 13 05:33:01 crc kubenswrapper[4894]: I0613 05:33:01.858630 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-4kfwh"] Jun 13 05:33:01 crc kubenswrapper[4894]: I0613 05:33:01.860049 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4kfwh" Jun 13 05:33:01 crc kubenswrapper[4894]: I0613 05:33:01.863222 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:33:01 crc kubenswrapper[4894]: I0613 05:33:01.961946 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:01 crc kubenswrapper[4894]: I0613 05:33:01.962482 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlz24\" (UniqueName: \"kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.063836 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlz24\" (UniqueName: \"kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.063933 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.064129 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.093023 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlz24\" (UniqueName: \"kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24\") pod \"crc-debug-4kfwh\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.186463 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4kfwh" Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.956487 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4kfwh" event={"ID":"d851e78c-66a3-49cd-afa9-66bb0b8e8b46","Type":"ContainerStarted","Data":"c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f"} Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.956936 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4kfwh" event={"ID":"d851e78c-66a3-49cd-afa9-66bb0b8e8b46","Type":"ContainerStarted","Data":"0cbb1868a31efb030cd7f62fb148023efd4f506635e57ee2f9515385bac15b9e"} Jun 13 05:33:02 crc kubenswrapper[4894]: I0613 05:33:02.975934 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-4kfwh" podStartSLOduration=1.975916808 podStartE2EDuration="1.975916808s" podCreationTimestamp="2025-06-13 05:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:33:02.968093537 +0000 UTC m=+2541.414341010" watchObservedRunningTime="2025-06-13 05:33:02.975916808 +0000 UTC m=+2541.422164281" Jun 13 05:33:12 crc kubenswrapper[4894]: I0613 05:33:12.836693 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-4kfwh"] Jun 13 05:33:12 crc kubenswrapper[4894]: I0613 05:33:12.837514 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-4kfwh" podUID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" containerName="container-00" containerID="cri-o://c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f" gracePeriod=2 Jun 13 05:33:12 crc kubenswrapper[4894]: I0613 05:33:12.844298 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-4kfwh"] Jun 13 05:33:12 crc kubenswrapper[4894]: I0613 05:33:12.925074 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4kfwh" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.022960 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlz24\" (UniqueName: \"kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24\") pod \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.023186 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host\") pod \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\" (UID: \"d851e78c-66a3-49cd-afa9-66bb0b8e8b46\") " Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.023532 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host" (OuterVolumeSpecName: "host") pod "d851e78c-66a3-49cd-afa9-66bb0b8e8b46" (UID: "d851e78c-66a3-49cd-afa9-66bb0b8e8b46"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.040129 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24" (OuterVolumeSpecName: "kube-api-access-nlz24") pod "d851e78c-66a3-49cd-afa9-66bb0b8e8b46" (UID: "d851e78c-66a3-49cd-afa9-66bb0b8e8b46"). 
InnerVolumeSpecName "kube-api-access-nlz24". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.063169 4894 generic.go:334] "Generic (PLEG): container finished" podID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" containerID="c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f" exitCode=0 Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.063224 4894 scope.go:117] "RemoveContainer" containerID="c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.063352 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4kfwh" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.110923 4894 scope.go:117] "RemoveContainer" containerID="c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f" Jun 13 05:33:13 crc kubenswrapper[4894]: E0613 05:33:13.111817 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f\": container with ID starting with c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f not found: ID does not exist" containerID="c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.111858 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f"} err="failed to get container status \"c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f\": rpc error: code = NotFound desc = could not find container \"c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f\": container with ID starting with c5e7838b9a810053c2c0bdb3b67bfcc8b5a405139e7e6dcdc0496e623d851c8f not found: ID does not exist" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.125531 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:33:13 crc kubenswrapper[4894]: I0613 05:33:13.125562 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlz24\" (UniqueName: \"kubernetes.io/projected/d851e78c-66a3-49cd-afa9-66bb0b8e8b46-kube-api-access-nlz24\") on node \"crc\" DevicePath \"\"" Jun 13 05:33:14 crc kubenswrapper[4894]: I0613 05:33:14.288459 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" path="/var/lib/kubelet/pods/d851e78c-66a3-49cd-afa9-66bb0b8e8b46/volumes" Jun 13 05:33:50 crc kubenswrapper[4894]: I0613 05:33:50.901453 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:33:50 crc kubenswrapper[4894]: E0613 05:33:50.902320 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" containerName="container-00" Jun 13 05:33:50 crc kubenswrapper[4894]: I0613 05:33:50.902335 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" containerName="container-00" Jun 13 05:33:50 crc kubenswrapper[4894]: I0613 05:33:50.902531 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d851e78c-66a3-49cd-afa9-66bb0b8e8b46" containerName="container-00" Jun 13 05:33:50 crc kubenswrapper[4894]: I0613 05:33:50.903835 
4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:50 crc kubenswrapper[4894]: I0613 05:33:50.922089 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.007077 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.007137 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmbl\" (UniqueName: \"kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.007164 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.109255 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.109318 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmbl\" (UniqueName: \"kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.109339 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.109821 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.109852 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.129220 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mdmbl\" (UniqueName: \"kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl\") pod \"redhat-operators-phncb\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.273354 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:33:51 crc kubenswrapper[4894]: I0613 05:33:51.736239 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:33:52 crc kubenswrapper[4894]: I0613 05:33:52.505689 4894 generic.go:334] "Generic (PLEG): container finished" podID="be573011-80e7-47e6-a881-973384b5e1ef" containerID="59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf" exitCode=0 Jun 13 05:33:52 crc kubenswrapper[4894]: I0613 05:33:52.505800 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerDied","Data":"59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf"} Jun 13 05:33:52 crc kubenswrapper[4894]: I0613 05:33:52.505993 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerStarted","Data":"ffe66c6285e1fc77beb43b595ad4c58411b5c20a554151c08e310dc414ddde46"} Jun 13 05:33:53 crc kubenswrapper[4894]: I0613 05:33:53.521074 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerStarted","Data":"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f"} Jun 13 05:33:56 crc kubenswrapper[4894]: I0613 05:33:56.558228 4894 generic.go:334] "Generic (PLEG): container finished" podID="be573011-80e7-47e6-a881-973384b5e1ef" containerID="a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f" exitCode=0 Jun 13 05:33:56 crc kubenswrapper[4894]: I0613 05:33:56.558302 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerDied","Data":"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f"} Jun 13 05:33:56 crc kubenswrapper[4894]: I0613 05:33:56.563446 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:33:57 crc kubenswrapper[4894]: I0613 05:33:57.572573 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerStarted","Data":"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4"} Jun 13 05:34:01 crc kubenswrapper[4894]: I0613 05:34:01.273747 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:01 crc kubenswrapper[4894]: I0613 05:34:01.274287 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.285155 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phncb" podStartSLOduration=7.799215766 
podStartE2EDuration="12.285139764s" podCreationTimestamp="2025-06-13 05:33:50 +0000 UTC" firstStartedPulling="2025-06-13 05:33:52.509167301 +0000 UTC m=+2590.955414764" lastFinishedPulling="2025-06-13 05:33:56.995091259 +0000 UTC m=+2595.441338762" observedRunningTime="2025-06-13 05:33:57.618282076 +0000 UTC m=+2596.064529549" watchObservedRunningTime="2025-06-13 05:34:02.285139764 +0000 UTC m=+2600.731387227" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.293160 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-kxxwh"] Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.294128 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.297179 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.320715 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phncb" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="registry-server" probeResult="failure" output=< Jun 13 05:34:02 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 05:34:02 crc kubenswrapper[4894]: > Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.447900 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbhj\" (UniqueName: \"kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.448290 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.549639 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbhj\" (UniqueName: \"kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.549834 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.549973 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 05:34:02.582523 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbhj\" (UniqueName: \"kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj\") pod \"crc-debug-kxxwh\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " pod="openstack/crc-debug-kxxwh" Jun 13 05:34:02 crc kubenswrapper[4894]: I0613 
05:34:02.614552 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kxxwh" Jun 13 05:34:03 crc kubenswrapper[4894]: I0613 05:34:03.631356 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kxxwh" event={"ID":"5f0da53e-15f0-41bf-ac59-013d2a958d01","Type":"ContainerStarted","Data":"dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848"} Jun 13 05:34:03 crc kubenswrapper[4894]: I0613 05:34:03.631792 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kxxwh" event={"ID":"5f0da53e-15f0-41bf-ac59-013d2a958d01","Type":"ContainerStarted","Data":"8063b43082ec5b636ad7ca92cf06d987b59b6b07473767df1894cf9d557a49d2"} Jun 13 05:34:03 crc kubenswrapper[4894]: I0613 05:34:03.656251 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-kxxwh" podStartSLOduration=1.6562250779999999 podStartE2EDuration="1.656225078s" podCreationTimestamp="2025-06-13 05:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:34:03.649301142 +0000 UTC m=+2602.095548645" watchObservedRunningTime="2025-06-13 05:34:03.656225078 +0000 UTC m=+2602.102472581" Jun 13 05:34:04 crc kubenswrapper[4894]: I0613 05:34:04.646815 4894 generic.go:334] "Generic (PLEG): container finished" podID="32106b20-1e0f-4cd7-bbd8-b092163a9035" containerID="03d05b9dbe7807ff623e231472cc51373984b112c3aedccbe489e48b1f01b68b" exitCode=0 Jun 13 05:34:04 crc kubenswrapper[4894]: I0613 05:34:04.646880 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" event={"ID":"32106b20-1e0f-4cd7-bbd8-b092163a9035","Type":"ContainerDied","Data":"03d05b9dbe7807ff623e231472cc51373984b112c3aedccbe489e48b1f01b68b"} Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.192028 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233119 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233229 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtj7q\" (UniqueName: \"kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233288 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233401 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233452 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.233583 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0\") pod \"32106b20-1e0f-4cd7-bbd8-b092163a9035\" (UID: \"32106b20-1e0f-4cd7-bbd8-b092163a9035\") " Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.238846 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q" (OuterVolumeSpecName: "kube-api-access-dtj7q") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "kube-api-access-dtj7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.239515 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph" (OuterVolumeSpecName: "ceph") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.251220 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.261605 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.270881 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory" (OuterVolumeSpecName: "inventory") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.271129 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "32106b20-1e0f-4cd7-bbd8-b092163a9035" (UID: "32106b20-1e0f-4cd7-bbd8-b092163a9035"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335770 4894 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335820 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335830 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtj7q\" (UniqueName: \"kubernetes.io/projected/32106b20-1e0f-4cd7-bbd8-b092163a9035-kube-api-access-dtj7q\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335840 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335849 4894 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.335858 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32106b20-1e0f-4cd7-bbd8-b092163a9035-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.672789 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" event={"ID":"32106b20-1e0f-4cd7-bbd8-b092163a9035","Type":"ContainerDied","Data":"c21b990a3b1afb9dddd84cfa539f34e3e7e4b5b525ada8a121774ed14a165d7d"} Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.672839 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21b990a3b1afb9dddd84cfa539f34e3e7e4b5b525ada8a121774ed14a165d7d" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 
05:34:06.672889 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzzpl" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.796129 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx"] Jun 13 05:34:06 crc kubenswrapper[4894]: E0613 05:34:06.797072 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32106b20-1e0f-4cd7-bbd8-b092163a9035" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.797106 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="32106b20-1e0f-4cd7-bbd8-b092163a9035" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.797414 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="32106b20-1e0f-4cd7-bbd8-b092163a9035" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.799415 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.807417 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.808060 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx"] Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.816205 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.816345 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.816381 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.816572 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.817179 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.817481 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844676 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844740 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844765 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844787 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844832 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844859 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6hbd\" (UniqueName: \"kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.844897 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.946897 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.946956 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6hbd\" (UniqueName: \"kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.947006 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.947058 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.947112 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.947133 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.947152 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.953482 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.954104 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.954260 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: 
I0613 05:34:06.955698 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.958417 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.965164 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:06 crc kubenswrapper[4894]: I0613 05:34:06.972418 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6hbd\" (UniqueName: \"kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:07 crc kubenswrapper[4894]: I0613 05:34:07.124717 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:34:07 crc kubenswrapper[4894]: I0613 05:34:07.731126 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx"] Jun 13 05:34:07 crc kubenswrapper[4894]: W0613 05:34:07.737780 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31bca8ac_baaf_4837_96fd_6f0e556e7c53.slice/crio-c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2 WatchSource:0}: Error finding container c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2: Status 404 returned error can't find the container with id c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2 Jun 13 05:34:08 crc kubenswrapper[4894]: I0613 05:34:08.693837 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" event={"ID":"31bca8ac-baaf-4837-96fd-6f0e556e7c53","Type":"ContainerStarted","Data":"48566b97e133f416703546a25efe98804724c81781146dbe5ab12151360783bf"} Jun 13 05:34:08 crc kubenswrapper[4894]: I0613 05:34:08.694456 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" event={"ID":"31bca8ac-baaf-4837-96fd-6f0e556e7c53","Type":"ContainerStarted","Data":"c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2"} Jun 13 05:34:08 crc kubenswrapper[4894]: I0613 05:34:08.711630 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" podStartSLOduration=2.217912062 podStartE2EDuration="2.711615084s" podCreationTimestamp="2025-06-13 05:34:06 +0000 UTC" firstStartedPulling="2025-06-13 05:34:07.741899911 +0000 UTC m=+2606.188147374" lastFinishedPulling="2025-06-13 05:34:08.235602923 +0000 UTC m=+2606.681850396" observedRunningTime="2025-06-13 05:34:08.708793864 +0000 UTC m=+2607.155041367" watchObservedRunningTime="2025-06-13 05:34:08.711615084 +0000 UTC m=+2607.157862547" Jun 13 05:34:11 crc kubenswrapper[4894]: I0613 05:34:11.358078 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:11 crc kubenswrapper[4894]: I0613 05:34:11.426932 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:11 crc kubenswrapper[4894]: I0613 05:34:11.621305 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:34:12 crc kubenswrapper[4894]: I0613 05:34:12.737628 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phncb" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="registry-server" containerID="cri-o://118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4" gracePeriod=2 Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.195226 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-kxxwh"] Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.195833 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-kxxwh" podUID="5f0da53e-15f0-41bf-ac59-013d2a958d01" containerName="container-00" 
containerID="cri-o://dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848" gracePeriod=2 Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.205801 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-kxxwh"] Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.320375 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.324017 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kxxwh" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.414623 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content\") pod \"be573011-80e7-47e6-a881-973384b5e1ef\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.414719 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities\") pod \"be573011-80e7-47e6-a881-973384b5e1ef\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.414769 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbhj\" (UniqueName: \"kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj\") pod \"5f0da53e-15f0-41bf-ac59-013d2a958d01\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.414885 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmbl\" (UniqueName: \"kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl\") pod \"be573011-80e7-47e6-a881-973384b5e1ef\" (UID: \"be573011-80e7-47e6-a881-973384b5e1ef\") " Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.414942 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host\") pod \"5f0da53e-15f0-41bf-ac59-013d2a958d01\" (UID: \"5f0da53e-15f0-41bf-ac59-013d2a958d01\") " Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.415444 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities" (OuterVolumeSpecName: "utilities") pod "be573011-80e7-47e6-a881-973384b5e1ef" (UID: "be573011-80e7-47e6-a881-973384b5e1ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.416441 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host" (OuterVolumeSpecName: "host") pod "5f0da53e-15f0-41bf-ac59-013d2a958d01" (UID: "5f0da53e-15f0-41bf-ac59-013d2a958d01"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.420478 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl" (OuterVolumeSpecName: "kube-api-access-mdmbl") pod "be573011-80e7-47e6-a881-973384b5e1ef" (UID: "be573011-80e7-47e6-a881-973384b5e1ef"). InnerVolumeSpecName "kube-api-access-mdmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.420840 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj" (OuterVolumeSpecName: "kube-api-access-7lbhj") pod "5f0da53e-15f0-41bf-ac59-013d2a958d01" (UID: "5f0da53e-15f0-41bf-ac59-013d2a958d01"). InnerVolumeSpecName "kube-api-access-7lbhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.498396 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be573011-80e7-47e6-a881-973384b5e1ef" (UID: "be573011-80e7-47e6-a881-973384b5e1ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.516750 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.516772 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be573011-80e7-47e6-a881-973384b5e1ef-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.516782 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lbhj\" (UniqueName: \"kubernetes.io/projected/5f0da53e-15f0-41bf-ac59-013d2a958d01-kube-api-access-7lbhj\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.516793 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdmbl\" (UniqueName: \"kubernetes.io/projected/be573011-80e7-47e6-a881-973384b5e1ef-kube-api-access-mdmbl\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.516803 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0da53e-15f0-41bf-ac59-013d2a958d01-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.752460 4894 generic.go:334] "Generic (PLEG): container finished" podID="be573011-80e7-47e6-a881-973384b5e1ef" containerID="118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4" exitCode=0 Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.752523 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phncb" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.752542 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerDied","Data":"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4"} Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.752615 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phncb" event={"ID":"be573011-80e7-47e6-a881-973384b5e1ef","Type":"ContainerDied","Data":"ffe66c6285e1fc77beb43b595ad4c58411b5c20a554151c08e310dc414ddde46"} Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.753124 4894 scope.go:117] "RemoveContainer" containerID="118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.755898 4894 generic.go:334] "Generic (PLEG): container finished" podID="5f0da53e-15f0-41bf-ac59-013d2a958d01" containerID="dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848" exitCode=0 Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.756005 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kxxwh" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.791191 4894 scope.go:117] "RemoveContainer" containerID="a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.816089 4894 scope.go:117] "RemoveContainer" containerID="59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.822638 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.829498 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phncb"] Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.838889 4894 scope.go:117] "RemoveContainer" containerID="118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4" Jun 13 05:34:13 crc kubenswrapper[4894]: E0613 05:34:13.839525 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4\": container with ID starting with 118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4 not found: ID does not exist" containerID="118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.839571 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4"} err="failed to get container status \"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4\": rpc error: code = NotFound desc = could not find container \"118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4\": container with ID starting with 118ce8417efedda5f5a6a396e6f144ffea185ce614b892bd097baaff175478d4 not found: ID does not exist" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.839627 4894 scope.go:117] "RemoveContainer" containerID="a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f" Jun 13 05:34:13 crc kubenswrapper[4894]: E0613 05:34:13.839968 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f\": container with ID starting with a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f not found: ID does not exist" containerID="a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.840091 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f"} err="failed to get container status \"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f\": rpc error: code = NotFound desc = could not find container \"a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f\": container with ID starting with a71b8bee3cda20660251df159558f4facc174b3c536932abb8dab3f31c1a5d9f not found: ID does not exist" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.840201 4894 scope.go:117] "RemoveContainer" containerID="59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf" Jun 13 05:34:13 crc kubenswrapper[4894]: E0613 05:34:13.840689 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf\": container with ID starting with 59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf not found: ID does not exist" containerID="59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.840742 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf"} err="failed to get container status \"59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf\": rpc error: code = NotFound desc = could not find container \"59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf\": container with ID starting with 59004c1a17f7d93bd783715ec6803fa8648d50f44ab398047bae677bbf2379cf not found: ID does not exist" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.840771 4894 scope.go:117] "RemoveContainer" containerID="dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.896634 4894 scope.go:117] "RemoveContainer" containerID="dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848" Jun 13 05:34:13 crc kubenswrapper[4894]: E0613 05:34:13.896916 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848\": container with ID starting with dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848 not found: ID does not exist" containerID="dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848" Jun 13 05:34:13 crc kubenswrapper[4894]: I0613 05:34:13.896964 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848"} err="failed to get container status \"dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848\": rpc error: code = NotFound desc = could not find container \"dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848\": container with ID starting with 
dd659f1e6f0d59cf4d2c2f4c1d396d2f219c955730546899912ff6412a8b5848 not found: ID does not exist" Jun 13 05:34:14 crc kubenswrapper[4894]: I0613 05:34:14.289744 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0da53e-15f0-41bf-ac59-013d2a958d01" path="/var/lib/kubelet/pods/5f0da53e-15f0-41bf-ac59-013d2a958d01/volumes" Jun 13 05:34:14 crc kubenswrapper[4894]: I0613 05:34:14.291030 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be573011-80e7-47e6-a881-973384b5e1ef" path="/var/lib/kubelet/pods/be573011-80e7-47e6-a881-973384b5e1ef/volumes" Jun 13 05:34:26 crc kubenswrapper[4894]: I0613 05:34:26.236695 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:34:26 crc kubenswrapper[4894]: I0613 05:34:26.237565 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:34:56 crc kubenswrapper[4894]: I0613 05:34:56.236194 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:34:56 crc kubenswrapper[4894]: I0613 05:34:56.238206 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.611864 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-grv5k"] Jun 13 05:35:01 crc kubenswrapper[4894]: E0613 05:35:01.612817 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="registry-server" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.612833 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="registry-server" Jun 13 05:35:01 crc kubenswrapper[4894]: E0613 05:35:01.612867 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="extract-content" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.612875 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="extract-content" Jun 13 05:35:01 crc kubenswrapper[4894]: E0613 05:35:01.612890 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="extract-utilities" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.612900 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="extract-utilities" Jun 13 05:35:01 crc kubenswrapper[4894]: E0613 05:35:01.612914 4894 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5f0da53e-15f0-41bf-ac59-013d2a958d01" containerName="container-00" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.612922 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0da53e-15f0-41bf-ac59-013d2a958d01" containerName="container-00" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.613140 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="be573011-80e7-47e6-a881-973384b5e1ef" containerName="registry-server" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.613166 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0da53e-15f0-41bf-ac59-013d2a958d01" containerName="container-00" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.613993 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.618755 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.640642 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv56l\" (UniqueName: \"kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.640984 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.742417 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv56l\" (UniqueName: \"kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.742542 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.742678 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.771086 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv56l\" (UniqueName: \"kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l\") pod \"crc-debug-grv5k\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " pod="openstack/crc-debug-grv5k" Jun 13 05:35:01 crc kubenswrapper[4894]: I0613 05:35:01.931207 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-grv5k" Jun 13 05:35:02 crc kubenswrapper[4894]: I0613 05:35:02.287504 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-grv5k" event={"ID":"bfbd10d8-df10-4151-9907-7b57dbb5a55b","Type":"ContainerStarted","Data":"82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721"} Jun 13 05:35:02 crc kubenswrapper[4894]: I0613 05:35:02.287813 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-grv5k" event={"ID":"bfbd10d8-df10-4151-9907-7b57dbb5a55b","Type":"ContainerStarted","Data":"8d8dd8dda2bfa1d984830cfbc7f5fb09eb748ce1249fb81cb40066cae2871de6"} Jun 13 05:35:02 crc kubenswrapper[4894]: I0613 05:35:02.300226 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-grv5k" podStartSLOduration=1.300213719 podStartE2EDuration="1.300213719s" podCreationTimestamp="2025-06-13 05:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:35:02.298060008 +0000 UTC m=+2660.744307471" watchObservedRunningTime="2025-06-13 05:35:02.300213719 +0000 UTC m=+2660.746461172" Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.527804 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-grv5k"] Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.528993 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-grv5k" podUID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" containerName="container-00" containerID="cri-o://82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721" gracePeriod=2 Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.542636 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-grv5k"] Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.602628 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-grv5k" Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.790328 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host\") pod \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.790492 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host" (OuterVolumeSpecName: "host") pod "bfbd10d8-df10-4151-9907-7b57dbb5a55b" (UID: "bfbd10d8-df10-4151-9907-7b57dbb5a55b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.791227 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv56l\" (UniqueName: \"kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l\") pod \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\" (UID: \"bfbd10d8-df10-4151-9907-7b57dbb5a55b\") " Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.792169 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfbd10d8-df10-4151-9907-7b57dbb5a55b-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.799949 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l" (OuterVolumeSpecName: "kube-api-access-bv56l") pod "bfbd10d8-df10-4151-9907-7b57dbb5a55b" (UID: "bfbd10d8-df10-4151-9907-7b57dbb5a55b"). InnerVolumeSpecName "kube-api-access-bv56l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:35:12 crc kubenswrapper[4894]: I0613 05:35:12.893914 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv56l\" (UniqueName: \"kubernetes.io/projected/bfbd10d8-df10-4151-9907-7b57dbb5a55b-kube-api-access-bv56l\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:13 crc kubenswrapper[4894]: I0613 05:35:13.438195 4894 generic.go:334] "Generic (PLEG): container finished" podID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" containerID="82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721" exitCode=0 Jun 13 05:35:13 crc kubenswrapper[4894]: I0613 05:35:13.438272 4894 scope.go:117] "RemoveContainer" containerID="82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721" Jun 13 05:35:13 crc kubenswrapper[4894]: I0613 05:35:13.438436 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-grv5k" Jun 13 05:35:13 crc kubenswrapper[4894]: I0613 05:35:13.473243 4894 scope.go:117] "RemoveContainer" containerID="82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721" Jun 13 05:35:13 crc kubenswrapper[4894]: E0613 05:35:13.473934 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721\": container with ID starting with 82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721 not found: ID does not exist" containerID="82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721" Jun 13 05:35:13 crc kubenswrapper[4894]: I0613 05:35:13.473993 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721"} err="failed to get container status \"82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721\": rpc error: code = NotFound desc = could not find container \"82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721\": container with ID starting with 82efe03535a755ca9df9b9e414c36a8aaf2fe65bbf333eb1fb62339e526f1721 not found: ID does not exist" Jun 13 05:35:14 crc kubenswrapper[4894]: I0613 05:35:14.174985 4894 scope.go:117] "RemoveContainer" containerID="dc7497e967dc1ac6beb6400c658ff764181e275a36a506c83f773e96ee6839b5" Jun 13 05:35:14 crc kubenswrapper[4894]: I0613 05:35:14.290957 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" path="/var/lib/kubelet/pods/bfbd10d8-df10-4151-9907-7b57dbb5a55b/volumes" Jun 13 05:35:14 crc kubenswrapper[4894]: I0613 05:35:14.452172 4894 generic.go:334] "Generic (PLEG): container finished" podID="31bca8ac-baaf-4837-96fd-6f0e556e7c53" containerID="48566b97e133f416703546a25efe98804724c81781146dbe5ab12151360783bf" exitCode=0 Jun 13 05:35:14 crc kubenswrapper[4894]: I0613 05:35:14.452238 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" event={"ID":"31bca8ac-baaf-4837-96fd-6f0e556e7c53","Type":"ContainerDied","Data":"48566b97e133f416703546a25efe98804724c81781146dbe5ab12151360783bf"} Jun 13 05:35:15 crc kubenswrapper[4894]: I0613 05:35:15.926136 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.061798 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6hbd\" (UniqueName: \"kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.061870 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.062015 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.062054 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.062476 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.062565 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.062606 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0\") pod \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\" (UID: \"31bca8ac-baaf-4837-96fd-6f0e556e7c53\") " Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.070217 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.070241 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd" (OuterVolumeSpecName: "kube-api-access-m6hbd") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). 
InnerVolumeSpecName "kube-api-access-m6hbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.071129 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph" (OuterVolumeSpecName: "ceph") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.090443 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory" (OuterVolumeSpecName: "inventory") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.100051 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.118524 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.127688 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "31bca8ac-baaf-4837-96fd-6f0e556e7c53" (UID: "31bca8ac-baaf-4837-96fd-6f0e556e7c53"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165350 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165432 4894 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165454 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165469 4894 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165510 4894 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165524 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6hbd\" (UniqueName: \"kubernetes.io/projected/31bca8ac-baaf-4837-96fd-6f0e556e7c53-kube-api-access-m6hbd\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.165535 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31bca8ac-baaf-4837-96fd-6f0e556e7c53-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.476221 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" event={"ID":"31bca8ac-baaf-4837-96fd-6f0e556e7c53","Type":"ContainerDied","Data":"c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2"} Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.476261 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.476277 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c33d88e8e59eded188f9bc1a2f2ea9f32d7604146a0167177465b60b4258afa2" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.599264 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564"] Jun 13 05:35:16 crc kubenswrapper[4894]: E0613 05:35:16.599815 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" containerName="container-00" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.599827 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" containerName="container-00" Jun 13 05:35:16 crc kubenswrapper[4894]: E0613 05:35:16.599841 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bca8ac-baaf-4837-96fd-6f0e556e7c53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.599849 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bca8ac-baaf-4837-96fd-6f0e556e7c53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.600007 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bca8ac-baaf-4837-96fd-6f0e556e7c53" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.600029 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbd10d8-df10-4151-9907-7b57dbb5a55b" containerName="container-00" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.600723 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.605049 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.605067 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.605374 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.605763 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.606761 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.607089 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.659882 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564"] Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777365 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777414 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777445 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cwzw\" (UniqueName: \"kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777641 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777778 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 
05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.777808 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.879789 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.880465 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.880569 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cwzw\" (UniqueName: \"kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.880677 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.880749 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.880771 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.884360 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.887079 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.887367 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.887394 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.888097 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.904423 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cwzw\" (UniqueName: \"kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6q564\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:16 crc kubenswrapper[4894]: I0613 05:35:16.940994 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:35:17 crc kubenswrapper[4894]: I0613 05:35:17.492312 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564"] Jun 13 05:35:18 crc kubenswrapper[4894]: I0613 05:35:18.501371 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" event={"ID":"3602457e-fe9b-47ab-9497-b0777af3f090","Type":"ContainerStarted","Data":"d835e94cdefaca12f8f989ef1ba9d05b3f0a85944c6afdc3e631b3b7778fcc2c"} Jun 13 05:35:18 crc kubenswrapper[4894]: I0613 05:35:18.501776 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" event={"ID":"3602457e-fe9b-47ab-9497-b0777af3f090","Type":"ContainerStarted","Data":"823496bebf18dec9ca509cced3255ba8ec8622ffe8b2b762d6f28cb86ab26c67"} Jun 13 05:35:18 crc kubenswrapper[4894]: I0613 05:35:18.526076 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" podStartSLOduration=2.03211726 podStartE2EDuration="2.52605443s" podCreationTimestamp="2025-06-13 05:35:16 +0000 UTC" firstStartedPulling="2025-06-13 05:35:17.504032554 +0000 UTC m=+2675.950280057" lastFinishedPulling="2025-06-13 05:35:17.997969754 +0000 UTC m=+2676.444217227" observedRunningTime="2025-06-13 05:35:18.522199881 +0000 UTC m=+2676.968447354" watchObservedRunningTime="2025-06-13 05:35:18.52605443 +0000 UTC m=+2676.972301903" Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.236583 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.237120 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.237178 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.237896 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.237951 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0" gracePeriod=600 Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.581783 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" 
containerID="ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0" exitCode=0 Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.582043 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0"} Jun 13 05:35:26 crc kubenswrapper[4894]: I0613 05:35:26.582075 4894 scope.go:117] "RemoveContainer" containerID="ce07ffb753c5ed78a985f7c736be051ecf03480d1218069c0c83053465c2b58a" Jun 13 05:35:27 crc kubenswrapper[4894]: I0613 05:35:27.597041 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c"} Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.005570 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-b8mzp"] Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.008046 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.010907 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.079352 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9vw\" (UniqueName: \"kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw\") pod \"crc-debug-b8mzp\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.079428 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host\") pod \"crc-debug-b8mzp\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.181427 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9vw\" (UniqueName: \"kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw\") pod \"crc-debug-b8mzp\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.181816 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host\") pod \"crc-debug-b8mzp\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.181919 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host\") pod \"crc-debug-b8mzp\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.215538 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9vw\" (UniqueName: \"kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw\") pod \"crc-debug-b8mzp\" (UID: 
\"34fef740-ee01-4537-acb5-d3ca92666416\") " pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: I0613 05:36:02.350071 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b8mzp" Jun 13 05:36:02 crc kubenswrapper[4894]: W0613 05:36:02.386241 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34fef740_ee01_4537_acb5_d3ca92666416.slice/crio-7a456312775b7c35cea1e3b6213b7410e58518f8ed2e02a28dc5c5974878e621 WatchSource:0}: Error finding container 7a456312775b7c35cea1e3b6213b7410e58518f8ed2e02a28dc5c5974878e621: Status 404 returned error can't find the container with id 7a456312775b7c35cea1e3b6213b7410e58518f8ed2e02a28dc5c5974878e621 Jun 13 05:36:03 crc kubenswrapper[4894]: I0613 05:36:03.023472 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b8mzp" event={"ID":"34fef740-ee01-4537-acb5-d3ca92666416","Type":"ContainerStarted","Data":"4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1"} Jun 13 05:36:03 crc kubenswrapper[4894]: I0613 05:36:03.023937 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b8mzp" event={"ID":"34fef740-ee01-4537-acb5-d3ca92666416","Type":"ContainerStarted","Data":"7a456312775b7c35cea1e3b6213b7410e58518f8ed2e02a28dc5c5974878e621"} Jun 13 05:36:03 crc kubenswrapper[4894]: I0613 05:36:03.058394 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-b8mzp" podStartSLOduration=2.058369227 podStartE2EDuration="2.058369227s" podCreationTimestamp="2025-06-13 05:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:36:03.043172427 +0000 UTC m=+2721.489419930" watchObservedRunningTime="2025-06-13 05:36:03.058369227 +0000 UTC m=+2721.504616720" Jun 13 05:36:12 crc kubenswrapper[4894]: I0613 05:36:12.929944 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-b8mzp"] Jun 13 05:36:12 crc kubenswrapper[4894]: I0613 05:36:12.930606 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-b8mzp" podUID="34fef740-ee01-4537-acb5-d3ca92666416" containerName="container-00" containerID="cri-o://4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1" gracePeriod=2 Jun 13 05:36:12 crc kubenswrapper[4894]: I0613 05:36:12.943327 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-b8mzp"] Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.044105 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-b8mzp" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.107335 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host\") pod \"34fef740-ee01-4537-acb5-d3ca92666416\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.107389 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn9vw\" (UniqueName: \"kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw\") pod \"34fef740-ee01-4537-acb5-d3ca92666416\" (UID: \"34fef740-ee01-4537-acb5-d3ca92666416\") " Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.107826 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host" (OuterVolumeSpecName: "host") pod "34fef740-ee01-4537-acb5-d3ca92666416" (UID: "34fef740-ee01-4537-acb5-d3ca92666416"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.115325 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw" (OuterVolumeSpecName: "kube-api-access-wn9vw") pod "34fef740-ee01-4537-acb5-d3ca92666416" (UID: "34fef740-ee01-4537-acb5-d3ca92666416"). InnerVolumeSpecName "kube-api-access-wn9vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.150175 4894 generic.go:334] "Generic (PLEG): container finished" podID="34fef740-ee01-4537-acb5-d3ca92666416" containerID="4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1" exitCode=0 Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.150234 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-b8mzp" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.150254 4894 scope.go:117] "RemoveContainer" containerID="4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.191296 4894 scope.go:117] "RemoveContainer" containerID="4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1" Jun 13 05:36:13 crc kubenswrapper[4894]: E0613 05:36:13.192397 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1\": container with ID starting with 4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1 not found: ID does not exist" containerID="4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.192446 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1"} err="failed to get container status \"4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1\": rpc error: code = NotFound desc = could not find container \"4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1\": container with ID starting with 4c779c3c155d27b3903e474023ed459d2cf6b9477e4f607184e08c1e79d57ce1 not found: ID does not exist" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.209956 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34fef740-ee01-4537-acb5-d3ca92666416-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:36:13 crc kubenswrapper[4894]: I0613 05:36:13.209980 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn9vw\" (UniqueName: \"kubernetes.io/projected/34fef740-ee01-4537-acb5-d3ca92666416-kube-api-access-wn9vw\") on node \"crc\" DevicePath \"\"" Jun 13 05:36:14 crc kubenswrapper[4894]: I0613 05:36:14.297776 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34fef740-ee01-4537-acb5-d3ca92666416" path="/var/lib/kubelet/pods/34fef740-ee01-4537-acb5-d3ca92666416/volumes" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.389893 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-dpvn2"] Jun 13 05:37:02 crc kubenswrapper[4894]: E0613 05:37:02.390699 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fef740-ee01-4537-acb5-d3ca92666416" containerName="container-00" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.390715 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fef740-ee01-4537-acb5-d3ca92666416" containerName="container-00" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.390978 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fef740-ee01-4537-acb5-d3ca92666416" containerName="container-00" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.391680 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.394029 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.570206 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdn6\" (UniqueName: \"kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.570303 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.672128 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.672275 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdn6\" (UniqueName: \"kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.672410 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.704045 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdn6\" (UniqueName: \"kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6\") pod \"crc-debug-dpvn2\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " pod="openstack/crc-debug-dpvn2" Jun 13 05:37:02 crc kubenswrapper[4894]: I0613 05:37:02.717228 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-dpvn2" Jun 13 05:37:03 crc kubenswrapper[4894]: I0613 05:37:03.677922 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-dpvn2" event={"ID":"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0","Type":"ContainerStarted","Data":"4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0"} Jun 13 05:37:03 crc kubenswrapper[4894]: I0613 05:37:03.678274 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-dpvn2" event={"ID":"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0","Type":"ContainerStarted","Data":"7d1822fff4260a31121658a0119f67fcefa58ac84dbbce0d5482ac885377a77f"} Jun 13 05:37:03 crc kubenswrapper[4894]: I0613 05:37:03.701643 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-dpvn2" podStartSLOduration=1.701609486 podStartE2EDuration="1.701609486s" podCreationTimestamp="2025-06-13 05:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:37:03.695253986 +0000 UTC m=+2782.141501459" watchObservedRunningTime="2025-06-13 05:37:03.701609486 +0000 UTC m=+2782.147856999" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.352737 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-dpvn2"] Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.353773 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-dpvn2" podUID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" containerName="container-00" containerID="cri-o://4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0" gracePeriod=2 Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.363025 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-dpvn2"] Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.508601 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-dpvn2" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.607448 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host\") pod \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.607624 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host" (OuterVolumeSpecName: "host") pod "44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" (UID: "44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.608140 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsdn6\" (UniqueName: \"kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6\") pod \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\" (UID: \"44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0\") " Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.609091 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.616866 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6" (OuterVolumeSpecName: "kube-api-access-zsdn6") pod "44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" (UID: "44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0"). InnerVolumeSpecName "kube-api-access-zsdn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.711909 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsdn6\" (UniqueName: \"kubernetes.io/projected/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0-kube-api-access-zsdn6\") on node \"crc\" DevicePath \"\"" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.798033 4894 generic.go:334] "Generic (PLEG): container finished" podID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" containerID="4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0" exitCode=0 Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.798087 4894 scope.go:117] "RemoveContainer" containerID="4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.798114 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-dpvn2" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.852364 4894 scope.go:117] "RemoveContainer" containerID="4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0" Jun 13 05:37:13 crc kubenswrapper[4894]: E0613 05:37:13.854016 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0\": container with ID starting with 4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0 not found: ID does not exist" containerID="4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0" Jun 13 05:37:13 crc kubenswrapper[4894]: I0613 05:37:13.854086 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0"} err="failed to get container status \"4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0\": rpc error: code = NotFound desc = could not find container \"4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0\": container with ID starting with 4cdf23c469ff0bd64875a30db03647ab9542057545e010813648059de2fe76d0 not found: ID does not exist" Jun 13 05:37:14 crc kubenswrapper[4894]: I0613 05:37:14.294370 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" path="/var/lib/kubelet/pods/44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0/volumes" Jun 13 05:37:26 crc kubenswrapper[4894]: I0613 05:37:26.236744 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:37:26 crc kubenswrapper[4894]: I0613 05:37:26.237335 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:37:56 crc kubenswrapper[4894]: I0613 05:37:56.236147 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:37:56 crc kubenswrapper[4894]: I0613 05:37:56.236964 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.783260 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-8wfvp"] Jun 13 05:38:01 crc kubenswrapper[4894]: E0613 05:38:01.786236 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" containerName="container-00" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.786551 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" 
containerName="container-00" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.787086 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="44373ca3-bd9e-4c6c-b6d8-9e8c7430f5d0" containerName="container-00" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.788544 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-8wfvp" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.800491 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.924856 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:01 crc kubenswrapper[4894]: I0613 05:38:01.925395 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tqgg\" (UniqueName: \"kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.027384 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tqgg\" (UniqueName: \"kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.027560 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.027747 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.061121 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tqgg\" (UniqueName: \"kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg\") pod \"crc-debug-8wfvp\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.122050 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8wfvp" Jun 13 05:38:02 crc kubenswrapper[4894]: W0613 05:38:02.183566 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4f0ebb_0285_499c_9e67_c7a292b4f548.slice/crio-2b7775b2d5b3f4af173c57428ed53c5cabbfc147cf0b720537b0f3d179e7445e WatchSource:0}: Error finding container 2b7775b2d5b3f4af173c57428ed53c5cabbfc147cf0b720537b0f3d179e7445e: Status 404 returned error can't find the container with id 2b7775b2d5b3f4af173c57428ed53c5cabbfc147cf0b720537b0f3d179e7445e Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.385322 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-8wfvp" event={"ID":"bd4f0ebb-0285-499c-9e67-c7a292b4f548","Type":"ContainerStarted","Data":"0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65"} Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.385641 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-8wfvp" event={"ID":"bd4f0ebb-0285-499c-9e67-c7a292b4f548","Type":"ContainerStarted","Data":"2b7775b2d5b3f4af173c57428ed53c5cabbfc147cf0b720537b0f3d179e7445e"} Jun 13 05:38:02 crc kubenswrapper[4894]: I0613 05:38:02.413355 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-8wfvp" podStartSLOduration=1.413328762 podStartE2EDuration="1.413328762s" podCreationTimestamp="2025-06-13 05:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:38:02.398908164 +0000 UTC m=+2840.845155637" watchObservedRunningTime="2025-06-13 05:38:02.413328762 +0000 UTC m=+2840.859576265" Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.698998 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-8wfvp"] Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.699983 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-8wfvp" podUID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" containerName="container-00" containerID="cri-o://0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65" gracePeriod=2 Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.711575 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-8wfvp"] Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.793177 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8wfvp" Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.954618 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host\") pod \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.954736 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tqgg\" (UniqueName: \"kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg\") pod \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\" (UID: \"bd4f0ebb-0285-499c-9e67-c7a292b4f548\") " Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.954752 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host" (OuterVolumeSpecName: "host") pod "bd4f0ebb-0285-499c-9e67-c7a292b4f548" (UID: "bd4f0ebb-0285-499c-9e67-c7a292b4f548"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.955260 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd4f0ebb-0285-499c-9e67-c7a292b4f548-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:38:12 crc kubenswrapper[4894]: I0613 05:38:12.964827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg" (OuterVolumeSpecName: "kube-api-access-6tqgg") pod "bd4f0ebb-0285-499c-9e67-c7a292b4f548" (UID: "bd4f0ebb-0285-499c-9e67-c7a292b4f548"). InnerVolumeSpecName "kube-api-access-6tqgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.057870 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tqgg\" (UniqueName: \"kubernetes.io/projected/bd4f0ebb-0285-499c-9e67-c7a292b4f548-kube-api-access-6tqgg\") on node \"crc\" DevicePath \"\"" Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.522238 4894 generic.go:334] "Generic (PLEG): container finished" podID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" containerID="0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65" exitCode=0 Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.522300 4894 scope.go:117] "RemoveContainer" containerID="0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65" Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.522299 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-8wfvp" Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.556584 4894 scope.go:117] "RemoveContainer" containerID="0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65" Jun 13 05:38:13 crc kubenswrapper[4894]: E0613 05:38:13.557405 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65\": container with ID starting with 0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65 not found: ID does not exist" containerID="0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65" Jun 13 05:38:13 crc kubenswrapper[4894]: I0613 05:38:13.557456 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65"} err="failed to get container status \"0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65\": rpc error: code = NotFound desc = could not find container \"0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65\": container with ID starting with 0cdc71065b2bc4d13ec15772950595315f2f0496cd59105c17836e080e013d65 not found: ID does not exist" Jun 13 05:38:14 crc kubenswrapper[4894]: I0613 05:38:14.286730 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" path="/var/lib/kubelet/pods/bd4f0ebb-0285-499c-9e67-c7a292b4f548/volumes" Jun 13 05:38:14 crc kubenswrapper[4894]: I0613 05:38:14.347538 4894 scope.go:117] "RemoveContainer" containerID="3c0ec4a32810875576994cf8b5c31c905c513b73b099473fef0b53e964623ca0" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.236572 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.237253 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.237302 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.237754 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.237806 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" gracePeriod=600 Jun 13 05:38:26 crc kubenswrapper[4894]: E0613 
05:38:26.366604 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.682050 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" exitCode=0 Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.682381 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c"} Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.682415 4894 scope.go:117] "RemoveContainer" containerID="ed5ca333edd9f3cfc3a1858bddc8373e2c3bcc10cb2d7956c87cf6bcefcc3bd0" Jun 13 05:38:26 crc kubenswrapper[4894]: I0613 05:38:26.682953 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:38:26 crc kubenswrapper[4894]: E0613 05:38:26.683162 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:38:40 crc kubenswrapper[4894]: I0613 05:38:40.276492 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:38:40 crc kubenswrapper[4894]: E0613 05:38:40.277485 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:38:52 crc kubenswrapper[4894]: I0613 05:38:52.284583 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:38:52 crc kubenswrapper[4894]: E0613 05:38:52.285409 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.141017 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-djd8j"] Jun 13 05:39:02 crc kubenswrapper[4894]: E0613 05:39:02.141948 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" 
containerName="container-00" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.141964 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" containerName="container-00" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.142184 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4f0ebb-0285-499c-9e67-c7a292b4f548" containerName="container-00" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.142925 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.145976 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.254152 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.254508 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpts4\" (UniqueName: \"kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.355632 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.355787 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpts4\" (UniqueName: \"kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.356121 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.381453 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpts4\" (UniqueName: \"kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4\") pod \"crc-debug-djd8j\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " pod="openstack/crc-debug-djd8j" Jun 13 05:39:02 crc kubenswrapper[4894]: I0613 05:39:02.463549 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-djd8j" Jun 13 05:39:03 crc kubenswrapper[4894]: I0613 05:39:03.098067 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-djd8j" event={"ID":"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff","Type":"ContainerStarted","Data":"e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff"} Jun 13 05:39:03 crc kubenswrapper[4894]: I0613 05:39:03.098466 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-djd8j" event={"ID":"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff","Type":"ContainerStarted","Data":"b01118cd19598754c177b0eb93b2e5dabcfb232524781249f226065b0cbd2acd"} Jun 13 05:39:03 crc kubenswrapper[4894]: I0613 05:39:03.113331 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-djd8j" podStartSLOduration=1.113307815 podStartE2EDuration="1.113307815s" podCreationTimestamp="2025-06-13 05:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:39:03.107321956 +0000 UTC m=+2901.553569429" watchObservedRunningTime="2025-06-13 05:39:03.113307815 +0000 UTC m=+2901.559555318" Jun 13 05:39:04 crc kubenswrapper[4894]: I0613 05:39:04.276857 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:39:04 crc kubenswrapper[4894]: E0613 05:39:04.277532 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.073768 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-djd8j"] Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.074392 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-djd8j" podUID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" containerName="container-00" containerID="cri-o://e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff" gracePeriod=2 Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.082259 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-djd8j"] Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.145351 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-djd8j" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.182622 4894 generic.go:334] "Generic (PLEG): container finished" podID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" containerID="e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff" exitCode=0 Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.182736 4894 scope.go:117] "RemoveContainer" containerID="e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.182758 4894 util.go:48] "No ready sandbox for pod can be found. 
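[Editor's note] Interleaved with the debug pods, the machine-config-daemon entries above keep repeating one cycle: the kubelet's HTTP liveness probe to http://127.0.0.1:8798/health gets connection refused, the container is killed (gracePeriod=600), and further restarts are held in a back-off currently at 5m0s (the kubelet's usual maximum), logged as CrashLoopBackOff. A quick way to check from the node whether anything is listening on that endpoint is to issue the same GET the prober does; the sketch below does exactly that, with an arbitrarily chosen 3-second timeout, and must run on the node itself since the target is loopback.

// Issue the same HTTP GET the kubelet liveness probe uses for
// machine-config-daemon (http://127.0.0.1:8798/health). Run on the node:
// the target is loopback, so it is unreachable from anywhere else.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 3 * time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// "connection refused" here matches the probe failures in the log.
		fmt.Println("probe-style check failed:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("status=%d body=%q\n", resp.StatusCode, body)
}

If the GET succeeds, the logged failures point at a transient condition; if it is still refused, the daemon is not serving its health endpoint at all and the CrashLoopBackOff will persist. [End note]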
Need to start a new one" pod="openstack/crc-debug-djd8j" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.200506 4894 scope.go:117] "RemoveContainer" containerID="e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff" Jun 13 05:39:13 crc kubenswrapper[4894]: E0613 05:39:13.200965 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff\": container with ID starting with e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff not found: ID does not exist" containerID="e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.201004 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff"} err="failed to get container status \"e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff\": rpc error: code = NotFound desc = could not find container \"e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff\": container with ID starting with e1edf3da1ee0df7ae30efbf66683c2a1bbf9cd668f601519fb6670b80a1bc4ff not found: ID does not exist" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.272478 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host\") pod \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.272601 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host" (OuterVolumeSpecName: "host") pod "11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" (UID: "11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.272620 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpts4\" (UniqueName: \"kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4\") pod \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\" (UID: \"11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff\") " Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.273078 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.280257 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4" (OuterVolumeSpecName: "kube-api-access-fpts4") pod "11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" (UID: "11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff"). InnerVolumeSpecName "kube-api-access-fpts4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:39:13 crc kubenswrapper[4894]: I0613 05:39:13.374723 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpts4\" (UniqueName: \"kubernetes.io/projected/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff-kube-api-access-fpts4\") on node \"crc\" DevicePath \"\"" Jun 13 05:39:14 crc kubenswrapper[4894]: I0613 05:39:14.288563 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" path="/var/lib/kubelet/pods/11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff/volumes" Jun 13 05:39:17 crc kubenswrapper[4894]: I0613 05:39:17.276719 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:39:17 crc kubenswrapper[4894]: E0613 05:39:17.277192 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:39:32 crc kubenswrapper[4894]: I0613 05:39:32.287954 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:39:32 crc kubenswrapper[4894]: E0613 05:39:32.290133 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:39:46 crc kubenswrapper[4894]: I0613 05:39:46.277196 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:39:46 crc kubenswrapper[4894]: E0613 05:39:46.277968 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:39:59 crc kubenswrapper[4894]: I0613 05:39:59.277369 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:39:59 crc kubenswrapper[4894]: E0613 05:39:59.280448 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.471746 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-5zwh2"] Jun 13 05:40:01 crc kubenswrapper[4894]: E0613 05:40:01.472168 4894 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" containerName="container-00" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.472189 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" containerName="container-00" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.472431 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a97d72-1a9b-4db3-9ec2-9fb4b54dfcff" containerName="container-00" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.473236 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.475069 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.589886 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.590366 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgtxs\" (UniqueName: \"kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.691631 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.691796 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgtxs\" (UniqueName: \"kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.691808 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.721335 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgtxs\" (UniqueName: \"kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs\") pod \"crc-debug-5zwh2\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " pod="openstack/crc-debug-5zwh2" Jun 13 05:40:01 crc kubenswrapper[4894]: I0613 05:40:01.810282 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5zwh2" Jun 13 05:40:02 crc kubenswrapper[4894]: I0613 05:40:02.706806 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5zwh2" event={"ID":"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75","Type":"ContainerStarted","Data":"4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402"} Jun 13 05:40:02 crc kubenswrapper[4894]: I0613 05:40:02.707204 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-5zwh2" event={"ID":"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75","Type":"ContainerStarted","Data":"1de62188b663a9cbcda2e03ec4b34c1d6ab26fe4e9bf237ebf8eee9ad0c12d65"} Jun 13 05:40:02 crc kubenswrapper[4894]: I0613 05:40:02.725257 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-5zwh2" podStartSLOduration=1.725221846 podStartE2EDuration="1.725221846s" podCreationTimestamp="2025-06-13 05:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:40:02.724942748 +0000 UTC m=+2961.171190251" watchObservedRunningTime="2025-06-13 05:40:02.725221846 +0000 UTC m=+2961.171469349" Jun 13 05:40:11 crc kubenswrapper[4894]: I0613 05:40:11.277804 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:40:11 crc kubenswrapper[4894]: E0613 05:40:11.278868 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.311266 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-5zwh2"] Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.311799 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-5zwh2" podUID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" containerName="container-00" containerID="cri-o://4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402" gracePeriod=2 Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.325441 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-5zwh2"] Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.390703 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5zwh2" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.530677 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host\") pod \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.530839 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgtxs\" (UniqueName: \"kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs\") pod \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\" (UID: \"1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75\") " Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.531859 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host" (OuterVolumeSpecName: "host") pod "1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" (UID: "1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.549596 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs" (OuterVolumeSpecName: "kube-api-access-wgtxs") pod "1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" (UID: "1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75"). InnerVolumeSpecName "kube-api-access-wgtxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.633383 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.633439 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgtxs\" (UniqueName: \"kubernetes.io/projected/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75-kube-api-access-wgtxs\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.824610 4894 generic.go:334] "Generic (PLEG): container finished" podID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" containerID="4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402" exitCode=0 Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.824761 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-5zwh2" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.825136 4894 scope.go:117] "RemoveContainer" containerID="4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.857592 4894 scope.go:117] "RemoveContainer" containerID="4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402" Jun 13 05:40:12 crc kubenswrapper[4894]: E0613 05:40:12.858255 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402\": container with ID starting with 4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402 not found: ID does not exist" containerID="4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402" Jun 13 05:40:12 crc kubenswrapper[4894]: I0613 05:40:12.858320 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402"} err="failed to get container status \"4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402\": rpc error: code = NotFound desc = could not find container \"4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402\": container with ID starting with 4c67b3daff5c3a03d50c20268b853d6ba40cc8c321850e80ae6f9f64705b6402 not found: ID does not exist" Jun 13 05:40:14 crc kubenswrapper[4894]: I0613 05:40:14.292323 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" path="/var/lib/kubelet/pods/1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75/volumes" Jun 13 05:40:21 crc kubenswrapper[4894]: I0613 05:40:21.947261 4894 generic.go:334] "Generic (PLEG): container finished" podID="3602457e-fe9b-47ab-9497-b0777af3f090" containerID="d835e94cdefaca12f8f989ef1ba9d05b3f0a85944c6afdc3e631b3b7778fcc2c" exitCode=0 Jun 13 05:40:21 crc kubenswrapper[4894]: I0613 05:40:21.947385 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" event={"ID":"3602457e-fe9b-47ab-9497-b0777af3f090","Type":"ContainerDied","Data":"d835e94cdefaca12f8f989ef1ba9d05b3f0a85944c6afdc3e631b3b7778fcc2c"} Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.366641 4894 util.go:48] "No ready sandbox for pod can be found. 
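[Editor's note] Because entries for the short-lived crc-debug-* pods, machine-config-daemon, and the EDPM deployment jobs are interleaved, following one pod's lifecycle through this journal is tedious. A throwaway filter that keeps only the lines mentioning a single pod name makes the create/mount/start/delete sequence readable at a glance; the sketch below assumes the journal is available as plain text on stdin, for example piped from journalctl -u kubelet, and the file name filterpod.go is arbitrary.

// Filter kubelet journal lines for a single pod, e.g.:
//   journalctl -u kubelet --no-pager | go run filterpod.go crc-debug-dpvn2
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	if len(os.Args) != 2 {
		fmt.Fprintln(os.Stderr, "usage: filterpod <pod-name-substring>")
		os.Exit(1)
	}
	needle := os.Args[1]
	scanner := bufio.NewScanner(os.Stdin)
	scanner.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines can be very long
	for scanner.Scan() {
		line := scanner.Text()
		if strings.Contains(line, needle) {
			fmt.Println(line)
		}
	}
	if err := scanner.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}
}
[End note]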
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421500 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421590 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421615 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421638 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421770 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.421800 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cwzw\" (UniqueName: \"kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw\") pod \"3602457e-fe9b-47ab-9497-b0777af3f090\" (UID: \"3602457e-fe9b-47ab-9497-b0777af3f090\") " Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.465641 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph" (OuterVolumeSpecName: "ceph") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.466252 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw" (OuterVolumeSpecName: "kube-api-access-6cwzw") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "kube-api-access-6cwzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.470093 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.471939 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory" (OuterVolumeSpecName: "inventory") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.474304 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.475960 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3602457e-fe9b-47ab-9497-b0777af3f090" (UID: "3602457e-fe9b-47ab-9497-b0777af3f090"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542162 4894 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542190 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cwzw\" (UniqueName: \"kubernetes.io/projected/3602457e-fe9b-47ab-9497-b0777af3f090-kube-api-access-6cwzw\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542202 4894 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542212 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542221 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.542230 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3602457e-fe9b-47ab-9497-b0777af3f090-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.974567 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" event={"ID":"3602457e-fe9b-47ab-9497-b0777af3f090","Type":"ContainerDied","Data":"823496bebf18dec9ca509cced3255ba8ec8622ffe8b2b762d6f28cb86ab26c67"} Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.974612 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823496bebf18dec9ca509cced3255ba8ec8622ffe8b2b762d6f28cb86ab26c67" Jun 13 05:40:23 crc kubenswrapper[4894]: I0613 05:40:23.974792 4894 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6q564" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.117356 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8"] Jun 13 05:40:24 crc kubenswrapper[4894]: E0613 05:40:24.117862 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" containerName="container-00" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.117880 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" containerName="container-00" Jun 13 05:40:24 crc kubenswrapper[4894]: E0613 05:40:24.117896 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3602457e-fe9b-47ab-9497-b0777af3f090" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.117903 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3602457e-fe9b-47ab-9497-b0777af3f090" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.119167 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2b7d45-bdae-4f23-87f0-9f6dc8b05c75" containerName="container-00" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.119194 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3602457e-fe9b-47ab-9497-b0777af3f090" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.119969 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.122581 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fn2" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.124278 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.124387 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.125161 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.125277 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.125380 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.125473 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.125641 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8"] Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.130085 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.131256 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jun 13 
05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.275871 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.277825 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.277970 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278095 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278200 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278297 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278427 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278521 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.276601 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:40:24 crc kubenswrapper[4894]: E0613 05:40:24.278902 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.278624 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.279225 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.279330 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqr9\" (UniqueName: \"kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.381596 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqr9\" (UniqueName: \"kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.384226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.385377 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.385507 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.385882 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.386335 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.386398 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.386864 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.386923 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.386939 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.387040 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.387084 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.389096 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.390990 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.391836 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.393590 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.393780 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.393844 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.408856 
4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqr9\" (UniqueName: \"kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.409422 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.413245 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.415386 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:24 crc kubenswrapper[4894]: I0613 05:40:24.492437 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:40:25 crc kubenswrapper[4894]: I0613 05:40:25.016123 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8"] Jun 13 05:40:25 crc kubenswrapper[4894]: I0613 05:40:25.017806 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:40:25 crc kubenswrapper[4894]: I0613 05:40:25.991360 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" event={"ID":"17cfff45-e7b3-4297-9b08-33a8ea345bc2","Type":"ContainerStarted","Data":"93c447b6ec8f104075f3dd6dffb5aaa2a0e8012e16633b8b3b0a35d416bb837b"} Jun 13 05:40:25 crc kubenswrapper[4894]: I0613 05:40:25.992714 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" event={"ID":"17cfff45-e7b3-4297-9b08-33a8ea345bc2","Type":"ContainerStarted","Data":"0c3f35fb700b449b8496ab1dd1522b86da89191bce434b78cf6536ae3e6c4ab6"} Jun 13 05:40:26 crc kubenswrapper[4894]: I0613 05:40:26.018364 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" podStartSLOduration=1.5688812109999999 podStartE2EDuration="2.018339144s" podCreationTimestamp="2025-06-13 05:40:24 +0000 UTC" firstStartedPulling="2025-06-13 05:40:25.017440415 +0000 UTC m=+2983.463687888" lastFinishedPulling="2025-06-13 05:40:25.466898318 +0000 UTC m=+2983.913145821" observedRunningTime="2025-06-13 05:40:26.011931213 +0000 UTC m=+2984.458178686" watchObservedRunningTime="2025-06-13 05:40:26.018339144 +0000 UTC m=+2984.464586637" Jun 13 05:40:38 crc kubenswrapper[4894]: I0613 05:40:38.277155 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:40:38 crc kubenswrapper[4894]: E0613 05:40:38.278731 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:40:50 crc kubenswrapper[4894]: I0613 05:40:50.277533 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:40:50 crc kubenswrapper[4894]: E0613 05:40:50.280237 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:01 crc kubenswrapper[4894]: I0613 05:41:01.787386 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-xg6tg"] Jun 13 05:41:01 crc kubenswrapper[4894]: I0613 05:41:01.791811 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xg6tg" Jun 13 05:41:01 crc kubenswrapper[4894]: I0613 05:41:01.794711 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:41:01 crc kubenswrapper[4894]: I0613 05:41:01.966194 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbmn\" (UniqueName: \"kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:01 crc kubenswrapper[4894]: I0613 05:41:01.966834 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.068949 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbmn\" (UniqueName: \"kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.069048 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.069430 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.108647 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbmn\" (UniqueName: \"kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn\") pod \"crc-debug-xg6tg\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.124879 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xg6tg" Jun 13 05:41:02 crc kubenswrapper[4894]: I0613 05:41:02.358335 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xg6tg" event={"ID":"a61eb18b-42c2-4636-a680-2e31aa304d23","Type":"ContainerStarted","Data":"d130413e05ce9e2c5b2787e74708a5256c4fdf85a13331837b9e0b0d696c1ee5"} Jun 13 05:41:03 crc kubenswrapper[4894]: I0613 05:41:03.366354 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-xg6tg" event={"ID":"a61eb18b-42c2-4636-a680-2e31aa304d23","Type":"ContainerStarted","Data":"befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db"} Jun 13 05:41:03 crc kubenswrapper[4894]: I0613 05:41:03.399796 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-xg6tg" podStartSLOduration=2.399778521 podStartE2EDuration="2.399778521s" podCreationTimestamp="2025-06-13 05:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:41:03.394100761 +0000 UTC m=+3021.840348224" watchObservedRunningTime="2025-06-13 05:41:03.399778521 +0000 UTC m=+3021.846025974" Jun 13 05:41:04 crc kubenswrapper[4894]: I0613 05:41:04.277407 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:41:04 crc kubenswrapper[4894]: E0613 05:41:04.277860 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:12 crc kubenswrapper[4894]: I0613 05:41:12.737532 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-xg6tg"] Jun 13 05:41:12 crc kubenswrapper[4894]: I0613 05:41:12.739569 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-xg6tg" podUID="a61eb18b-42c2-4636-a680-2e31aa304d23" containerName="container-00" containerID="cri-o://befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db" gracePeriod=2 Jun 13 05:41:12 crc kubenswrapper[4894]: I0613 05:41:12.751744 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-xg6tg"] Jun 13 05:41:12 crc kubenswrapper[4894]: I0613 05:41:12.851688 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-xg6tg" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.026870 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host\") pod \"a61eb18b-42c2-4636-a680-2e31aa304d23\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.027005 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host" (OuterVolumeSpecName: "host") pod "a61eb18b-42c2-4636-a680-2e31aa304d23" (UID: "a61eb18b-42c2-4636-a680-2e31aa304d23"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.028214 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgbmn\" (UniqueName: \"kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn\") pod \"a61eb18b-42c2-4636-a680-2e31aa304d23\" (UID: \"a61eb18b-42c2-4636-a680-2e31aa304d23\") " Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.029098 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a61eb18b-42c2-4636-a680-2e31aa304d23-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.037735 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn" (OuterVolumeSpecName: "kube-api-access-kgbmn") pod "a61eb18b-42c2-4636-a680-2e31aa304d23" (UID: "a61eb18b-42c2-4636-a680-2e31aa304d23"). InnerVolumeSpecName "kube-api-access-kgbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.131544 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgbmn\" (UniqueName: \"kubernetes.io/projected/a61eb18b-42c2-4636-a680-2e31aa304d23-kube-api-access-kgbmn\") on node \"crc\" DevicePath \"\"" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.471947 4894 generic.go:334] "Generic (PLEG): container finished" podID="a61eb18b-42c2-4636-a680-2e31aa304d23" containerID="befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db" exitCode=0 Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.472266 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-xg6tg" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.472198 4894 scope.go:117] "RemoveContainer" containerID="befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.501003 4894 scope.go:117] "RemoveContainer" containerID="befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db" Jun 13 05:41:13 crc kubenswrapper[4894]: E0613 05:41:13.502011 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db\": container with ID starting with befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db not found: ID does not exist" containerID="befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db" Jun 13 05:41:13 crc kubenswrapper[4894]: I0613 05:41:13.502061 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db"} err="failed to get container status \"befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db\": rpc error: code = NotFound desc = could not find container \"befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db\": container with ID starting with befffa33f2a7510e013df2e4fba8ee8676907707636416f7dc8d553ae07e36db not found: ID does not exist" Jun 13 05:41:14 crc kubenswrapper[4894]: I0613 05:41:14.292387 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61eb18b-42c2-4636-a680-2e31aa304d23" path="/var/lib/kubelet/pods/a61eb18b-42c2-4636-a680-2e31aa304d23/volumes" Jun 13 05:41:18 crc kubenswrapper[4894]: I0613 05:41:18.276573 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:41:18 crc kubenswrapper[4894]: E0613 05:41:18.277366 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:33 crc kubenswrapper[4894]: I0613 05:41:33.276392 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:41:33 crc kubenswrapper[4894]: E0613 05:41:33.278849 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:47 crc kubenswrapper[4894]: I0613 05:41:47.317868 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:41:47 crc kubenswrapper[4894]: E0613 05:41:47.318863 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.834114 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:41:52 crc kubenswrapper[4894]: E0613 05:41:52.837061 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61eb18b-42c2-4636-a680-2e31aa304d23" containerName="container-00" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.837238 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61eb18b-42c2-4636-a680-2e31aa304d23" containerName="container-00" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.840630 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61eb18b-42c2-4636-a680-2e31aa304d23" containerName="container-00" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.843040 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.869896 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.880113 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.880748 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.880870 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2w9\" (UniqueName: \"kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.982491 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.982642 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.982698 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2w9\" 
(UniqueName: \"kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.983140 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:52 crc kubenswrapper[4894]: I0613 05:41:52.983215 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.005474 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2w9\" (UniqueName: \"kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9\") pod \"certified-operators-ddlvc\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.180894 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.652370 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:41:53 crc kubenswrapper[4894]: W0613 05:41:53.663977 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e34b728_b291_4d69_92cd_053a9aa8faed.slice/crio-529de631577c57ab1af4c49978a5a154bbf4cb269ac1f503c3c2ce1df6adcf13 WatchSource:0}: Error finding container 529de631577c57ab1af4c49978a5a154bbf4cb269ac1f503c3c2ce1df6adcf13: Status 404 returned error can't find the container with id 529de631577c57ab1af4c49978a5a154bbf4cb269ac1f503c3c2ce1df6adcf13 Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.948947 4894 generic.go:334] "Generic (PLEG): container finished" podID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerID="a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745" exitCode=0 Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.949021 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerDied","Data":"a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745"} Jun 13 05:41:53 crc kubenswrapper[4894]: I0613 05:41:53.949087 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerStarted","Data":"529de631577c57ab1af4c49978a5a154bbf4cb269ac1f503c3c2ce1df6adcf13"} Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.225249 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.232540 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.239240 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.415428 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.415574 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwpz\" (UniqueName: \"kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.415640 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.517822 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwpz\" (UniqueName: \"kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.517897 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.518119 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.518950 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.519736 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.550965 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tbwpz\" (UniqueName: \"kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz\") pod \"community-operators-fsfrh\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.576413 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:41:54 crc kubenswrapper[4894]: I0613 05:41:54.959460 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerStarted","Data":"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7"} Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.087565 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.225250 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.227603 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.253645 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.333509 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmcb\" (UniqueName: \"kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.333574 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.333595 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.434984 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmcb\" (UniqueName: \"kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.435057 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " 
pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.435074 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.435538 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.435581 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.453961 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmcb\" (UniqueName: \"kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb\") pod \"redhat-marketplace-xgwbv\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.548219 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.977728 4894 generic.go:334] "Generic (PLEG): container finished" podID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerID="9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7" exitCode=0 Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.977754 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerDied","Data":"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7"} Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.978163 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.980801 4894 generic.go:334] "Generic (PLEG): container finished" podID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerID="2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195" exitCode=0 Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.980870 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerDied","Data":"2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195"} Jun 13 05:41:55 crc kubenswrapper[4894]: I0613 05:41:55.980920 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerStarted","Data":"0844f83b4dabf8f10bfbc8a6b9eb21769e0f50bf693583ba154a1d41a0850a6b"} Jun 13 05:41:56 crc kubenswrapper[4894]: I0613 05:41:56.991174 4894 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerStarted","Data":"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8"} Jun 13 05:41:56 crc kubenswrapper[4894]: I0613 05:41:56.992569 4894 generic.go:334] "Generic (PLEG): container finished" podID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerID="964409e0972bc845be30478f4d04f1273ea6f548dc1c7aaafc8642727761b337" exitCode=0 Jun 13 05:41:56 crc kubenswrapper[4894]: I0613 05:41:56.992639 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerDied","Data":"964409e0972bc845be30478f4d04f1273ea6f548dc1c7aaafc8642727761b337"} Jun 13 05:41:56 crc kubenswrapper[4894]: I0613 05:41:56.992678 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerStarted","Data":"2cebcfdb4a82ec475b921e418c5aadc95d17d71de733c1cda7e59af829b80459"} Jun 13 05:41:56 crc kubenswrapper[4894]: I0613 05:41:56.996785 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerStarted","Data":"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85"} Jun 13 05:41:57 crc kubenswrapper[4894]: I0613 05:41:57.071911 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ddlvc" podStartSLOduration=2.578352625 podStartE2EDuration="5.071890822s" podCreationTimestamp="2025-06-13 05:41:52 +0000 UTC" firstStartedPulling="2025-06-13 05:41:53.951621141 +0000 UTC m=+3072.397868644" lastFinishedPulling="2025-06-13 05:41:56.445159338 +0000 UTC m=+3074.891406841" observedRunningTime="2025-06-13 05:41:57.051386182 +0000 UTC m=+3075.497633655" watchObservedRunningTime="2025-06-13 05:41:57.071890822 +0000 UTC m=+3075.518138295" Jun 13 05:41:58 crc kubenswrapper[4894]: I0613 05:41:58.005797 4894 generic.go:334] "Generic (PLEG): container finished" podID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerID="fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8" exitCode=0 Jun 13 05:41:58 crc kubenswrapper[4894]: I0613 05:41:58.005970 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerDied","Data":"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8"} Jun 13 05:41:58 crc kubenswrapper[4894]: I0613 05:41:58.011935 4894 generic.go:334] "Generic (PLEG): container finished" podID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerID="0248b7c441f97f90d9308c622765b6d25d51e4b93363ca19bb27e05e273f1694" exitCode=0 Jun 13 05:41:58 crc kubenswrapper[4894]: I0613 05:41:58.012124 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerDied","Data":"0248b7c441f97f90d9308c622765b6d25d51e4b93363ca19bb27e05e273f1694"} Jun 13 05:41:58 crc kubenswrapper[4894]: I0613 05:41:58.278719 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:41:58 crc kubenswrapper[4894]: E0613 05:41:58.279031 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:41:59 crc kubenswrapper[4894]: I0613 05:41:59.021817 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerStarted","Data":"cfd201caffc7cdf1e621584e5ac9a91c69b157eff2d55fd373dc21124837a63e"} Jun 13 05:41:59 crc kubenswrapper[4894]: I0613 05:41:59.025161 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerStarted","Data":"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0"} Jun 13 05:41:59 crc kubenswrapper[4894]: I0613 05:41:59.043348 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xgwbv" podStartSLOduration=2.5964918900000002 podStartE2EDuration="4.043331932s" podCreationTimestamp="2025-06-13 05:41:55 +0000 UTC" firstStartedPulling="2025-06-13 05:41:56.99504527 +0000 UTC m=+3075.441292733" lastFinishedPulling="2025-06-13 05:41:58.441885302 +0000 UTC m=+3076.888132775" observedRunningTime="2025-06-13 05:41:59.037697602 +0000 UTC m=+3077.483945065" watchObservedRunningTime="2025-06-13 05:41:59.043331932 +0000 UTC m=+3077.489579385" Jun 13 05:41:59 crc kubenswrapper[4894]: I0613 05:41:59.059454 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fsfrh" podStartSLOduration=2.548437727 podStartE2EDuration="5.059440877s" podCreationTimestamp="2025-06-13 05:41:54 +0000 UTC" firstStartedPulling="2025-06-13 05:41:55.990377694 +0000 UTC m=+3074.436625147" lastFinishedPulling="2025-06-13 05:41:58.501380794 +0000 UTC m=+3076.947628297" observedRunningTime="2025-06-13 05:41:59.053030826 +0000 UTC m=+3077.499278289" watchObservedRunningTime="2025-06-13 05:41:59.059440877 +0000 UTC m=+3077.505688340" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.114646 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-gzh2v"] Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.117066 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.118841 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.198237 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb48m\" (UniqueName: \"kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.198366 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.300620 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.300902 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb48m\" (UniqueName: \"kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.300906 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.328916 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb48m\" (UniqueName: \"kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m\") pod \"crc-debug-gzh2v\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: I0613 05:42:02.449715 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-gzh2v" Jun 13 05:42:02 crc kubenswrapper[4894]: W0613 05:42:02.490044 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f814b37_349c_4188_a25a_dfc8ee3e0589.slice/crio-e7ed583d0716699e9c103265f36c721ea12697458678cda9ab44cec9c9fa770f WatchSource:0}: Error finding container e7ed583d0716699e9c103265f36c721ea12697458678cda9ab44cec9c9fa770f: Status 404 returned error can't find the container with id e7ed583d0716699e9c103265f36c721ea12697458678cda9ab44cec9c9fa770f Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.064872 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-gzh2v" event={"ID":"1f814b37-349c-4188-a25a-dfc8ee3e0589","Type":"ContainerStarted","Data":"e2014cb54eac7b02c6ebdafc8643e29cc5ed59f809c7847268ca4b4670218918"} Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.065209 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-gzh2v" event={"ID":"1f814b37-349c-4188-a25a-dfc8ee3e0589","Type":"ContainerStarted","Data":"e7ed583d0716699e9c103265f36c721ea12697458678cda9ab44cec9c9fa770f"} Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.181757 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.181816 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.230813 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:03 crc kubenswrapper[4894]: I0613 05:42:03.252482 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-gzh2v" podStartSLOduration=1.252467397 podStartE2EDuration="1.252467397s" podCreationTimestamp="2025-06-13 05:42:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:42:03.082108312 +0000 UTC m=+3081.528355805" watchObservedRunningTime="2025-06-13 05:42:03.252467397 +0000 UTC m=+3081.698714860" Jun 13 05:42:04 crc kubenswrapper[4894]: I0613 05:42:04.153380 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:04 crc kubenswrapper[4894]: I0613 05:42:04.410749 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:42:04 crc kubenswrapper[4894]: I0613 05:42:04.578845 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:04 crc kubenswrapper[4894]: I0613 05:42:04.578917 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:04 crc kubenswrapper[4894]: I0613 05:42:04.659299 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:05 crc kubenswrapper[4894]: I0613 05:42:05.141649 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:05 crc kubenswrapper[4894]: I0613 05:42:05.549143 4894 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:05 crc kubenswrapper[4894]: I0613 05:42:05.549198 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:05 crc kubenswrapper[4894]: I0613 05:42:05.640592 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.100622 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ddlvc" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="registry-server" containerID="cri-o://1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85" gracePeriod=2 Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.183514 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.577220 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.603579 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j2w9\" (UniqueName: \"kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9\") pod \"5e34b728-b291-4d69-92cd-053a9aa8faed\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.603751 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities\") pod \"5e34b728-b291-4d69-92cd-053a9aa8faed\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.603826 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content\") pod \"5e34b728-b291-4d69-92cd-053a9aa8faed\" (UID: \"5e34b728-b291-4d69-92cd-053a9aa8faed\") " Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.604519 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities" (OuterVolumeSpecName: "utilities") pod "5e34b728-b291-4d69-92cd-053a9aa8faed" (UID: "5e34b728-b291-4d69-92cd-053a9aa8faed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.604838 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.613985 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9" (OuterVolumeSpecName: "kube-api-access-6j2w9") pod "5e34b728-b291-4d69-92cd-053a9aa8faed" (UID: "5e34b728-b291-4d69-92cd-053a9aa8faed"). InnerVolumeSpecName "kube-api-access-6j2w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.631962 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e34b728-b291-4d69-92cd-053a9aa8faed" (UID: "5e34b728-b291-4d69-92cd-053a9aa8faed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.707311 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j2w9\" (UniqueName: \"kubernetes.io/projected/5e34b728-b291-4d69-92cd-053a9aa8faed-kube-api-access-6j2w9\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:06 crc kubenswrapper[4894]: I0613 05:42:06.707349 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34b728-b291-4d69-92cd-053a9aa8faed-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.012965 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.118785 4894 generic.go:334] "Generic (PLEG): container finished" podID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerID="1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85" exitCode=0 Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.118883 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddlvc" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.118925 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerDied","Data":"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85"} Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.119205 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddlvc" event={"ID":"5e34b728-b291-4d69-92cd-053a9aa8faed","Type":"ContainerDied","Data":"529de631577c57ab1af4c49978a5a154bbf4cb269ac1f503c3c2ce1df6adcf13"} Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.119239 4894 scope.go:117] "RemoveContainer" containerID="1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.121166 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fsfrh" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="registry-server" containerID="cri-o://a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0" gracePeriod=2 Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.171847 4894 scope.go:117] "RemoveContainer" containerID="9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.212609 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.228407 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ddlvc"] Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.253682 4894 scope.go:117] "RemoveContainer" 
containerID="a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.334184 4894 scope.go:117] "RemoveContainer" containerID="1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85" Jun 13 05:42:07 crc kubenswrapper[4894]: E0613 05:42:07.334626 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85\": container with ID starting with 1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85 not found: ID does not exist" containerID="1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.334760 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85"} err="failed to get container status \"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85\": rpc error: code = NotFound desc = could not find container \"1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85\": container with ID starting with 1d6de6e839bc0b4202936d2b6c6918be2abaec3c7bf7b28bf97e3258226d5a85 not found: ID does not exist" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.334790 4894 scope.go:117] "RemoveContainer" containerID="9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7" Jun 13 05:42:07 crc kubenswrapper[4894]: E0613 05:42:07.335152 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7\": container with ID starting with 9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7 not found: ID does not exist" containerID="9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.335181 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7"} err="failed to get container status \"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7\": rpc error: code = NotFound desc = could not find container \"9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7\": container with ID starting with 9daf6bd0beef31f825948c51eab05459a9ed3b2813dcc24768a6c32d674412c7 not found: ID does not exist" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.335201 4894 scope.go:117] "RemoveContainer" containerID="a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745" Jun 13 05:42:07 crc kubenswrapper[4894]: E0613 05:42:07.335511 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745\": container with ID starting with a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745 not found: ID does not exist" containerID="a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.335539 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745"} err="failed to get container status \"a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745\": rpc error: code = 
NotFound desc = could not find container \"a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745\": container with ID starting with a30276e512bd88e7405e2818ee248c02dfd0924efa801d6cc54138cd897ed745 not found: ID does not exist" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.573841 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.629564 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content\") pod \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.629779 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities\") pod \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.629828 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbwpz\" (UniqueName: \"kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz\") pod \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\" (UID: \"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e\") " Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.631010 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities" (OuterVolumeSpecName: "utilities") pod "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" (UID: "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.635725 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz" (OuterVolumeSpecName: "kube-api-access-tbwpz") pod "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" (UID: "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e"). InnerVolumeSpecName "kube-api-access-tbwpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.673560 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" (UID: "f8fe0b72-6279-4cfd-bf97-dd4f0b42646e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.732140 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.732189 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbwpz\" (UniqueName: \"kubernetes.io/projected/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-kube-api-access-tbwpz\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:07 crc kubenswrapper[4894]: I0613 05:42:07.732212 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.136288 4894 generic.go:334] "Generic (PLEG): container finished" podID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerID="a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0" exitCode=0 Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.136383 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerDied","Data":"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0"} Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.136424 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsfrh" event={"ID":"f8fe0b72-6279-4cfd-bf97-dd4f0b42646e","Type":"ContainerDied","Data":"0844f83b4dabf8f10bfbc8a6b9eb21769e0f50bf693583ba154a1d41a0850a6b"} Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.136452 4894 scope.go:117] "RemoveContainer" containerID="a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.136585 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsfrh" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.160690 4894 scope.go:117] "RemoveContainer" containerID="fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.189892 4894 scope.go:117] "RemoveContainer" containerID="2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.198943 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.208671 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fsfrh"] Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.210273 4894 scope.go:117] "RemoveContainer" containerID="a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0" Jun 13 05:42:08 crc kubenswrapper[4894]: E0613 05:42:08.210610 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0\": container with ID starting with a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0 not found: ID does not exist" containerID="a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.210642 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0"} err="failed to get container status \"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0\": rpc error: code = NotFound desc = could not find container \"a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0\": container with ID starting with a6b1fde32e297149683b55e80ee3624592f1bebfd3cfd0864c1d47d9fd81d5a0 not found: ID does not exist" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.210680 4894 scope.go:117] "RemoveContainer" containerID="fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8" Jun 13 05:42:08 crc kubenswrapper[4894]: E0613 05:42:08.211053 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8\": container with ID starting with fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8 not found: ID does not exist" containerID="fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.211075 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8"} err="failed to get container status \"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8\": rpc error: code = NotFound desc = could not find container \"fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8\": container with ID starting with fdb700ffeb2131fac4ecb0e70d397fafcd2c0e2188aed30a02373fc4248cf5a8 not found: ID does not exist" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.211090 4894 scope.go:117] "RemoveContainer" containerID="2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195" Jun 13 05:42:08 crc kubenswrapper[4894]: E0613 05:42:08.211389 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195\": container with ID starting with 2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195 not found: ID does not exist" containerID="2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.211414 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195"} err="failed to get container status \"2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195\": rpc error: code = NotFound desc = could not find container \"2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195\": container with ID starting with 2fae869253beb6812d480f042ad2b0ba8d6465ecd52ab15d635c508b78101195 not found: ID does not exist" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.285990 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" path="/var/lib/kubelet/pods/5e34b728-b291-4d69-92cd-053a9aa8faed/volumes" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.286856 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" path="/var/lib/kubelet/pods/f8fe0b72-6279-4cfd-bf97-dd4f0b42646e/volumes" Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.814739 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:42:08 crc kubenswrapper[4894]: I0613 05:42:08.815371 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xgwbv" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="registry-server" containerID="cri-o://cfd201caffc7cdf1e621584e5ac9a91c69b157eff2d55fd373dc21124837a63e" gracePeriod=2 Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.150193 4894 generic.go:334] "Generic (PLEG): container finished" podID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerID="cfd201caffc7cdf1e621584e5ac9a91c69b157eff2d55fd373dc21124837a63e" exitCode=0 Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.150249 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerDied","Data":"cfd201caffc7cdf1e621584e5ac9a91c69b157eff2d55fd373dc21124837a63e"} Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.230265 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.298825 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdmcb\" (UniqueName: \"kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb\") pod \"93d94d89-a484-49b1-a63e-078b9a58c7d0\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.298874 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content\") pod \"93d94d89-a484-49b1-a63e-078b9a58c7d0\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.299096 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities\") pod \"93d94d89-a484-49b1-a63e-078b9a58c7d0\" (UID: \"93d94d89-a484-49b1-a63e-078b9a58c7d0\") " Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.299737 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities" (OuterVolumeSpecName: "utilities") pod "93d94d89-a484-49b1-a63e-078b9a58c7d0" (UID: "93d94d89-a484-49b1-a63e-078b9a58c7d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.300492 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.306015 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb" (OuterVolumeSpecName: "kube-api-access-gdmcb") pod "93d94d89-a484-49b1-a63e-078b9a58c7d0" (UID: "93d94d89-a484-49b1-a63e-078b9a58c7d0"). InnerVolumeSpecName "kube-api-access-gdmcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.310316 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93d94d89-a484-49b1-a63e-078b9a58c7d0" (UID: "93d94d89-a484-49b1-a63e-078b9a58c7d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.401985 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdmcb\" (UniqueName: \"kubernetes.io/projected/93d94d89-a484-49b1-a63e-078b9a58c7d0-kube-api-access-gdmcb\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:09 crc kubenswrapper[4894]: I0613 05:42:09.402067 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d94d89-a484-49b1-a63e-078b9a58c7d0-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.174295 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xgwbv" event={"ID":"93d94d89-a484-49b1-a63e-078b9a58c7d0","Type":"ContainerDied","Data":"2cebcfdb4a82ec475b921e418c5aadc95d17d71de733c1cda7e59af829b80459"} Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.174406 4894 scope.go:117] "RemoveContainer" containerID="cfd201caffc7cdf1e621584e5ac9a91c69b157eff2d55fd373dc21124837a63e" Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.174444 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xgwbv" Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.210930 4894 scope.go:117] "RemoveContainer" containerID="0248b7c441f97f90d9308c622765b6d25d51e4b93363ca19bb27e05e273f1694" Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.225897 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.234249 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xgwbv"] Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.249473 4894 scope.go:117] "RemoveContainer" containerID="964409e0972bc845be30478f4d04f1273ea6f548dc1c7aaafc8642727761b337" Jun 13 05:42:10 crc kubenswrapper[4894]: I0613 05:42:10.313706 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" path="/var/lib/kubelet/pods/93d94d89-a484-49b1-a63e-078b9a58c7d0/volumes" Jun 13 05:42:12 crc kubenswrapper[4894]: I0613 05:42:12.286883 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:42:12 crc kubenswrapper[4894]: E0613 05:42:12.288055 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.131194 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-gzh2v"] Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.131402 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-gzh2v" podUID="1f814b37-349c-4188-a25a-dfc8ee3e0589" containerName="container-00" containerID="cri-o://e2014cb54eac7b02c6ebdafc8643e29cc5ed59f809c7847268ca4b4670218918" gracePeriod=2 Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.139054 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/crc-debug-gzh2v"] Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.204142 4894 generic.go:334] "Generic (PLEG): container finished" podID="1f814b37-349c-4188-a25a-dfc8ee3e0589" containerID="e2014cb54eac7b02c6ebdafc8643e29cc5ed59f809c7847268ca4b4670218918" exitCode=0 Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.204187 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7ed583d0716699e9c103265f36c721ea12697458678cda9ab44cec9c9fa770f" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.205616 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-gzh2v" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.280117 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host\") pod \"1f814b37-349c-4188-a25a-dfc8ee3e0589\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.280268 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb48m\" (UniqueName: \"kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m\") pod \"1f814b37-349c-4188-a25a-dfc8ee3e0589\" (UID: \"1f814b37-349c-4188-a25a-dfc8ee3e0589\") " Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.281097 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host" (OuterVolumeSpecName: "host") pod "1f814b37-349c-4188-a25a-dfc8ee3e0589" (UID: "1f814b37-349c-4188-a25a-dfc8ee3e0589"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.310968 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m" (OuterVolumeSpecName: "kube-api-access-qb48m") pod "1f814b37-349c-4188-a25a-dfc8ee3e0589" (UID: "1f814b37-349c-4188-a25a-dfc8ee3e0589"). InnerVolumeSpecName "kube-api-access-qb48m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.382879 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f814b37-349c-4188-a25a-dfc8ee3e0589-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:13 crc kubenswrapper[4894]: I0613 05:42:13.382905 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb48m\" (UniqueName: \"kubernetes.io/projected/1f814b37-349c-4188-a25a-dfc8ee3e0589-kube-api-access-qb48m\") on node \"crc\" DevicePath \"\"" Jun 13 05:42:14 crc kubenswrapper[4894]: I0613 05:42:14.211524 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-gzh2v" Jun 13 05:42:14 crc kubenswrapper[4894]: I0613 05:42:14.288751 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f814b37-349c-4188-a25a-dfc8ee3e0589" path="/var/lib/kubelet/pods/1f814b37-349c-4188-a25a-dfc8ee3e0589/volumes" Jun 13 05:42:25 crc kubenswrapper[4894]: I0613 05:42:25.276829 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:42:25 crc kubenswrapper[4894]: E0613 05:42:25.277454 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:42:38 crc kubenswrapper[4894]: I0613 05:42:38.281675 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:42:38 crc kubenswrapper[4894]: E0613 05:42:38.282626 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:42:49 crc kubenswrapper[4894]: I0613 05:42:49.277901 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:42:49 crc kubenswrapper[4894]: E0613 05:42:49.278999 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.277222 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.277832 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.582800 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-4rk44"] Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583288 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="extract-utilities" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583337 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="extract-utilities" Jun 13 
05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583366 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583379 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583397 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="extract-utilities" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583411 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="extract-utilities" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583431 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="extract-utilities" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583443 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="extract-utilities" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583475 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583486 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583514 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583528 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583552 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583564 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583584 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583598 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583616 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f814b37-349c-4188-a25a-dfc8ee3e0589" containerName="container-00" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583628 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f814b37-349c-4188-a25a-dfc8ee3e0589" containerName="container-00" Jun 13 05:43:01 crc kubenswrapper[4894]: E0613 05:43:01.583643 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583679 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" 
containerName="extract-content" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.583993 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f814b37-349c-4188-a25a-dfc8ee3e0589" containerName="container-00" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.584022 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e34b728-b291-4d69-92cd-053a9aa8faed" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.584047 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fe0b72-6279-4cfd-bf97-dd4f0b42646e" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.584075 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d94d89-a484-49b1-a63e-078b9a58c7d0" containerName="registry-server" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.585071 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.587983 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.736793 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.736854 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrpv\" (UniqueName: \"kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.838622 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.838732 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrpv\" (UniqueName: \"kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.838792 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.870295 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrpv\" (UniqueName: \"kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv\") pod \"crc-debug-4rk44\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " pod="openstack/crc-debug-4rk44" Jun 13 05:43:01 crc kubenswrapper[4894]: I0613 05:43:01.914844 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4rk44" Jun 13 05:43:02 crc kubenswrapper[4894]: I0613 05:43:02.700103 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4rk44" event={"ID":"87ccf4d7-14a0-41af-8902-0e5b6b2fa182","Type":"ContainerStarted","Data":"2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea"} Jun 13 05:43:02 crc kubenswrapper[4894]: I0613 05:43:02.701567 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4rk44" event={"ID":"87ccf4d7-14a0-41af-8902-0e5b6b2fa182","Type":"ContainerStarted","Data":"ae5a508b3c65608a1ee62e714bb3b54050025fc2ec813a550d969d6486ebf6b5"} Jun 13 05:43:02 crc kubenswrapper[4894]: I0613 05:43:02.730650 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-4rk44" podStartSLOduration=1.730627815 podStartE2EDuration="1.730627815s" podCreationTimestamp="2025-06-13 05:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:43:02.724364688 +0000 UTC m=+3141.170612181" watchObservedRunningTime="2025-06-13 05:43:02.730627815 +0000 UTC m=+3141.176875288" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.589937 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-4rk44"] Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.590775 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-4rk44" podUID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" containerName="container-00" containerID="cri-o://2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea" gracePeriod=2 Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.607625 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-4rk44"] Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.693867 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4rk44" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.784669 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host\") pod \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.785191 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrpv\" (UniqueName: \"kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv\") pod \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\" (UID: \"87ccf4d7-14a0-41af-8902-0e5b6b2fa182\") " Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.784780 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host" (OuterVolumeSpecName: "host") pod "87ccf4d7-14a0-41af-8902-0e5b6b2fa182" (UID: "87ccf4d7-14a0-41af-8902-0e5b6b2fa182"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.796195 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv" (OuterVolumeSpecName: "kube-api-access-mdrpv") pod "87ccf4d7-14a0-41af-8902-0e5b6b2fa182" (UID: "87ccf4d7-14a0-41af-8902-0e5b6b2fa182"). 
InnerVolumeSpecName "kube-api-access-mdrpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.811720 4894 generic.go:334] "Generic (PLEG): container finished" podID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" containerID="2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea" exitCode=0 Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.811784 4894 scope.go:117] "RemoveContainer" containerID="2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.811857 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4rk44" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.874056 4894 scope.go:117] "RemoveContainer" containerID="2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea" Jun 13 05:43:12 crc kubenswrapper[4894]: E0613 05:43:12.874745 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea\": container with ID starting with 2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea not found: ID does not exist" containerID="2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.874790 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea"} err="failed to get container status \"2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea\": rpc error: code = NotFound desc = could not find container \"2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea\": container with ID starting with 2dcb0d1183edd2fa137b7f48d50aea68b68cd5fb96c01e8457577098ba5f71ea not found: ID does not exist" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.887423 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:43:12 crc kubenswrapper[4894]: I0613 05:43:12.887449 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrpv\" (UniqueName: \"kubernetes.io/projected/87ccf4d7-14a0-41af-8902-0e5b6b2fa182-kube-api-access-mdrpv\") on node \"crc\" DevicePath \"\"" Jun 13 05:43:14 crc kubenswrapper[4894]: I0613 05:43:14.301787 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" path="/var/lib/kubelet/pods/87ccf4d7-14a0-41af-8902-0e5b6b2fa182/volumes" Jun 13 05:43:15 crc kubenswrapper[4894]: I0613 05:43:15.297353 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:43:15 crc kubenswrapper[4894]: E0613 05:43:15.297577 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:43:29 crc kubenswrapper[4894]: I0613 05:43:29.276328 4894 scope.go:117] "RemoveContainer" 
containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:43:29 crc kubenswrapper[4894]: I0613 05:43:29.992835 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846"} Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.096196 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-b52lk"] Jun 13 05:44:02 crc kubenswrapper[4894]: E0613 05:44:02.097238 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" containerName="container-00" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.097253 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" containerName="container-00" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.097479 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ccf4d7-14a0-41af-8902-0e5b6b2fa182" containerName="container-00" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.098204 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.101016 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.268791 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.268913 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k222x\" (UniqueName: \"kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.370527 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.370706 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k222x\" (UniqueName: \"kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.370762 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.394162 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k222x\" (UniqueName: 
\"kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x\") pod \"crc-debug-b52lk\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: I0613 05:44:02.418979 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-b52lk" Jun 13 05:44:02 crc kubenswrapper[4894]: W0613 05:44:02.451985 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadc2fff_5a9a_4fe8_bbcf_ae81d0645aa9.slice/crio-52510f47574a997a9edca9cf6ee5c4eaa6b6fd0b630b03c653ef54b9bd3d51db WatchSource:0}: Error finding container 52510f47574a997a9edca9cf6ee5c4eaa6b6fd0b630b03c653ef54b9bd3d51db: Status 404 returned error can't find the container with id 52510f47574a997a9edca9cf6ee5c4eaa6b6fd0b630b03c653ef54b9bd3d51db Jun 13 05:44:03 crc kubenswrapper[4894]: I0613 05:44:03.346216 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b52lk" event={"ID":"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9","Type":"ContainerStarted","Data":"30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd"} Jun 13 05:44:03 crc kubenswrapper[4894]: I0613 05:44:03.346861 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-b52lk" event={"ID":"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9","Type":"ContainerStarted","Data":"52510f47574a997a9edca9cf6ee5c4eaa6b6fd0b630b03c653ef54b9bd3d51db"} Jun 13 05:44:03 crc kubenswrapper[4894]: I0613 05:44:03.365690 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-b52lk" podStartSLOduration=1.365672962 podStartE2EDuration="1.365672962s" podCreationTimestamp="2025-06-13 05:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:44:03.361179755 +0000 UTC m=+3201.807427218" watchObservedRunningTime="2025-06-13 05:44:03.365672962 +0000 UTC m=+3201.811920425" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.028877 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-b52lk"] Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.029861 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-b52lk" podUID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" containerName="container-00" containerID="cri-o://30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd" gracePeriod=2 Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.042032 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-b52lk"] Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.104896 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-b52lk" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.199191 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k222x\" (UniqueName: \"kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x\") pod \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.199541 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host\") pod \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\" (UID: \"dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9\") " Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.199815 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host" (OuterVolumeSpecName: "host") pod "dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" (UID: "dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.200542 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.208286 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x" (OuterVolumeSpecName: "kube-api-access-k222x") pod "dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" (UID: "dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9"). InnerVolumeSpecName "kube-api-access-k222x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.303389 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k222x\" (UniqueName: \"kubernetes.io/projected/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9-kube-api-access-k222x\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.469292 4894 generic.go:334] "Generic (PLEG): container finished" podID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" containerID="30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd" exitCode=0 Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.469374 4894 scope.go:117] "RemoveContainer" containerID="30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.469577 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-b52lk" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.507916 4894 scope.go:117] "RemoveContainer" containerID="30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd" Jun 13 05:44:13 crc kubenswrapper[4894]: E0613 05:44:13.508507 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd\": container with ID starting with 30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd not found: ID does not exist" containerID="30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd" Jun 13 05:44:13 crc kubenswrapper[4894]: I0613 05:44:13.508572 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd"} err="failed to get container status \"30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd\": rpc error: code = NotFound desc = could not find container \"30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd\": container with ID starting with 30a6a2b6195b50cf976b6ac573711c5a2c17fe83b5a36da3cf19ad97303c68fd not found: ID does not exist" Jun 13 05:44:14 crc kubenswrapper[4894]: I0613 05:44:14.291325 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" path="/var/lib/kubelet/pods/dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9/volumes" Jun 13 05:44:42 crc kubenswrapper[4894]: I0613 05:44:42.782768 4894 generic.go:334] "Generic (PLEG): container finished" podID="17cfff45-e7b3-4297-9b08-33a8ea345bc2" containerID="93c447b6ec8f104075f3dd6dffb5aaa2a0e8012e16633b8b3b0a35d416bb837b" exitCode=0 Jun 13 05:44:42 crc kubenswrapper[4894]: I0613 05:44:42.783364 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" event={"ID":"17cfff45-e7b3-4297-9b08-33a8ea345bc2","Type":"ContainerDied","Data":"93c447b6ec8f104075f3dd6dffb5aaa2a0e8012e16633b8b3b0a35d416bb837b"} Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.255188 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381179 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381258 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381377 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqr9\" (UniqueName: \"kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381400 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381443 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381514 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.381535 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.384517 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.384778 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.384816 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.384860 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory\") pod \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\" (UID: \"17cfff45-e7b3-4297-9b08-33a8ea345bc2\") " Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.387445 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9" (OuterVolumeSpecName: "kube-api-access-dnqr9") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "kube-api-access-dnqr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.390715 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph" (OuterVolumeSpecName: "ceph") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.392336 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.420016 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.422366 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.424488 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory" (OuterVolumeSpecName: "inventory") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.424521 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). 
InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.430995 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.434562 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.436975 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.440573 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "17cfff45-e7b3-4297-9b08-33a8ea345bc2" (UID: "17cfff45-e7b3-4297-9b08-33a8ea345bc2"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487307 4894 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487353 4894 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487365 4894 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487380 4894 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487392 4894 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487403 4894 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-inventory\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487414 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487426 4894 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487437 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqr9\" (UniqueName: \"kubernetes.io/projected/17cfff45-e7b3-4297-9b08-33a8ea345bc2-kube-api-access-dnqr9\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487450 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.487464 4894 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfff45-e7b3-4297-9b08-33a8ea345bc2-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.807811 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" event={"ID":"17cfff45-e7b3-4297-9b08-33a8ea345bc2","Type":"ContainerDied","Data":"0c3f35fb700b449b8496ab1dd1522b86da89191bce434b78cf6536ae3e6c4ab6"} Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.807876 4894 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="0c3f35fb700b449b8496ab1dd1522b86da89191bce434b78cf6536ae3e6c4ab6" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.807965 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.965827 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:44 crc kubenswrapper[4894]: E0613 05:44:44.966210 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cfff45-e7b3-4297-9b08-33a8ea345bc2" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.966227 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cfff45-e7b3-4297-9b08-33a8ea345bc2" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jun 13 05:44:44 crc kubenswrapper[4894]: E0613 05:44:44.966249 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" containerName="container-00" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.966258 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" containerName="container-00" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.966489 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cfff45-e7b3-4297-9b08-33a8ea345bc2" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.966529 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadc2fff-5a9a-4fe8-bbcf-ae81d0645aa9" containerName="container-00" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.968282 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:44 crc kubenswrapper[4894]: I0613 05:44:44.980194 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.111367 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjqr\" (UniqueName: \"kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.111454 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.111485 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.215021 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjqr\" (UniqueName: \"kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.215135 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.215167 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.217158 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.217167 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.249639 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rqjqr\" (UniqueName: \"kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr\") pod \"redhat-operators-kp5fw\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.287600 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.791861 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:45 crc kubenswrapper[4894]: I0613 05:44:45.842522 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerStarted","Data":"008eb34d8ffa0d340c18ebb37f7ecd49216fdc4123217803ab3a1650caf7edf9"} Jun 13 05:44:46 crc kubenswrapper[4894]: I0613 05:44:46.851817 4894 generic.go:334] "Generic (PLEG): container finished" podID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerID="a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90" exitCode=0 Jun 13 05:44:46 crc kubenswrapper[4894]: I0613 05:44:46.851899 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerDied","Data":"a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90"} Jun 13 05:44:47 crc kubenswrapper[4894]: I0613 05:44:47.863522 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerStarted","Data":"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990"} Jun 13 05:44:48 crc kubenswrapper[4894]: I0613 05:44:48.878284 4894 generic.go:334] "Generic (PLEG): container finished" podID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerID="6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990" exitCode=0 Jun 13 05:44:48 crc kubenswrapper[4894]: I0613 05:44:48.878374 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerDied","Data":"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990"} Jun 13 05:44:49 crc kubenswrapper[4894]: I0613 05:44:49.939970 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerStarted","Data":"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6"} Jun 13 05:44:49 crc kubenswrapper[4894]: I0613 05:44:49.974853 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kp5fw" podStartSLOduration=3.423280951 podStartE2EDuration="5.974825447s" podCreationTimestamp="2025-06-13 05:44:44 +0000 UTC" firstStartedPulling="2025-06-13 05:44:46.853740333 +0000 UTC m=+3245.299987836" lastFinishedPulling="2025-06-13 05:44:49.405284829 +0000 UTC m=+3247.851532332" observedRunningTime="2025-06-13 05:44:49.96645221 +0000 UTC m=+3248.412699703" watchObservedRunningTime="2025-06-13 05:44:49.974825447 +0000 UTC m=+3248.421072940" Jun 13 05:44:55 crc kubenswrapper[4894]: I0613 05:44:55.288068 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 
05:44:55 crc kubenswrapper[4894]: I0613 05:44:55.288349 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:55 crc kubenswrapper[4894]: I0613 05:44:55.327100 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:56 crc kubenswrapper[4894]: I0613 05:44:56.075988 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:56 crc kubenswrapper[4894]: I0613 05:44:56.135526 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.051902 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kp5fw" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="registry-server" containerID="cri-o://2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6" gracePeriod=2 Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.489238 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.636943 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content\") pod \"ad018da6-2296-4b87-8c0d-67f47c152cc0\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.637237 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities\") pod \"ad018da6-2296-4b87-8c0d-67f47c152cc0\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.637319 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjqr\" (UniqueName: \"kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr\") pod \"ad018da6-2296-4b87-8c0d-67f47c152cc0\" (UID: \"ad018da6-2296-4b87-8c0d-67f47c152cc0\") " Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.638414 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities" (OuterVolumeSpecName: "utilities") pod "ad018da6-2296-4b87-8c0d-67f47c152cc0" (UID: "ad018da6-2296-4b87-8c0d-67f47c152cc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.649060 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr" (OuterVolumeSpecName: "kube-api-access-rqjqr") pod "ad018da6-2296-4b87-8c0d-67f47c152cc0" (UID: "ad018da6-2296-4b87-8c0d-67f47c152cc0"). InnerVolumeSpecName "kube-api-access-rqjqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.727995 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad018da6-2296-4b87-8c0d-67f47c152cc0" (UID: "ad018da6-2296-4b87-8c0d-67f47c152cc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.740092 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.740146 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad018da6-2296-4b87-8c0d-67f47c152cc0-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:58 crc kubenswrapper[4894]: I0613 05:44:58.740168 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjqr\" (UniqueName: \"kubernetes.io/projected/ad018da6-2296-4b87-8c0d-67f47c152cc0-kube-api-access-rqjqr\") on node \"crc\" DevicePath \"\"" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.064612 4894 generic.go:334] "Generic (PLEG): container finished" podID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerID="2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6" exitCode=0 Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.064681 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerDied","Data":"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6"} Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.064686 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kp5fw" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.064716 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp5fw" event={"ID":"ad018da6-2296-4b87-8c0d-67f47c152cc0","Type":"ContainerDied","Data":"008eb34d8ffa0d340c18ebb37f7ecd49216fdc4123217803ab3a1650caf7edf9"} Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.064740 4894 scope.go:117] "RemoveContainer" containerID="2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.105680 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.105917 4894 scope.go:117] "RemoveContainer" containerID="6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.113507 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kp5fw"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.122032 4894 scope.go:117] "RemoveContainer" containerID="a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.188683 4894 scope.go:117] "RemoveContainer" containerID="2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6" Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.189999 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6\": container with ID starting with 2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6 not found: ID does not exist" containerID="2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.190045 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6"} err="failed to get container status \"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6\": rpc error: code = NotFound desc = could not find container \"2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6\": container with ID starting with 2656ad95c6b11b60bcc344acad82d6c205912a910a600189f674b1724cb6b9e6 not found: ID does not exist" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.190071 4894 scope.go:117] "RemoveContainer" containerID="6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990" Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.190518 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990\": container with ID starting with 6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990 not found: ID does not exist" containerID="6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.190625 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990"} err="failed to get container status \"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990\": rpc error: code = NotFound desc = could not find container 
\"6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990\": container with ID starting with 6af330b73af89f85517d236d98395cdae667908b293daff37760debc5db37990 not found: ID does not exist" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.190737 4894 scope.go:117] "RemoveContainer" containerID="a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90" Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.191200 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90\": container with ID starting with a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90 not found: ID does not exist" containerID="a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.191252 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90"} err="failed to get container status \"a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90\": rpc error: code = NotFound desc = could not find container \"a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90\": container with ID starting with a92eb539eeee44019106fbca7d9ae9efacd476ab7507112726783d56fa3a0a90 not found: ID does not exist" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.307770 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.308083 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="extract-utilities" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.308098 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="extract-utilities" Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.308117 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="registry-server" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.308123 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="registry-server" Jun 13 05:44:59 crc kubenswrapper[4894]: E0613 05:44:59.308137 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="extract-content" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.308144 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="extract-content" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.308302 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" containerName="registry-server" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.309124 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.310930 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.311549 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.335469 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.385585 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.386936 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.389417 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.404179 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.451580 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.451938 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.452098 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.452247 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qgcg\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-kube-api-access-4qgcg\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.452397 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.452583 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-localtime\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " 
pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.452804 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.453446 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.453595 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-run\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.453765 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.453994 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-dev\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.454516 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.454758 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.454958 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-sys\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.455161 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 
05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.456122 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.456328 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558118 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558165 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-sys\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558198 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558219 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558241 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-ceph\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558258 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f2cb\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-kube-api-access-9f2cb\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558277 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-sys\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558291 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-localtime\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558308 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558325 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558340 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558353 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558369 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558391 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558408 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558411 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558425 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: 
\"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558440 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558468 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.558520 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-sys\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559043 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559165 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559290 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559311 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559368 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qgcg\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-kube-api-access-4qgcg\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559504 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559609 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-dev\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559719 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-localtime\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559867 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559982 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560055 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560134 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560236 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560326 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-run\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.559833 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-localtime\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560542 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" 
Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560737 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560847 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-run\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.560956 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-run\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561058 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561133 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-scripts\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561212 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561322 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561439 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-dev\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.561576 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/285093b2-d93f-4e96-86e2-66bfe23a93e2-dev\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.567913 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-ceph\") pod \"cinder-volume-volume1-0\" (UID: 
\"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.568036 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.568186 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.568261 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.581606 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285093b2-d93f-4e96-86e2-66bfe23a93e2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.582221 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qgcg\" (UniqueName: \"kubernetes.io/projected/285093b2-d93f-4e96-86e2-66bfe23a93e2-kube-api-access-4qgcg\") pod \"cinder-volume-volume1-0\" (UID: \"285093b2-d93f-4e96-86e2-66bfe23a93e2\") " pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.626921 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663018 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663061 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663102 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-run\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663121 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663140 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-scripts\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663167 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663209 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-sys\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663237 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663258 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-ceph\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663276 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f2cb\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-kube-api-access-9f2cb\") pod \"cinder-backup-0\" (UID: 
\"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663291 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-localtime\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663312 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663331 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663345 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663368 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663400 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-dev\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663421 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663524 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663559 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-sys\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.663589 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: 
\"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.664001 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.664247 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-localtime\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.664286 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.664309 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.665757 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-dev\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.665827 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-run\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.665852 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.665903 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/768fd773-29d0-4a76-9b25-aa40764378a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.672443 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-ceph\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.673183 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" 
Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.674218 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-scripts\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.674342 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.674962 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768fd773-29d0-4a76-9b25-aa40764378a0-config-data\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.682514 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f2cb\" (UniqueName: \"kubernetes.io/projected/768fd773-29d0-4a76-9b25-aa40764378a0-kube-api-access-9f2cb\") pod \"cinder-backup-0\" (UID: \"768fd773-29d0-4a76-9b25-aa40764378a0\") " pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.708135 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.967089 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-lg282"] Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.972245 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lg282" Jun 13 05:44:59 crc kubenswrapper[4894]: I0613 05:44:59.980927 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lg282"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.075034 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwj7\" (UniqueName: \"kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7\") pod \"manila-db-create-lg282\" (UID: \"2288fc24-1bb7-4f72-bfbf-bab43156306e\") " pod="openstack/manila-db-create-lg282" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.136187 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.137282 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.141385 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.141393 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.150940 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.176360 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwj7\" (UniqueName: \"kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7\") pod \"manila-db-create-lg282\" (UID: \"2288fc24-1bb7-4f72-bfbf-bab43156306e\") " pod="openstack/manila-db-create-lg282" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.198112 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwj7\" (UniqueName: \"kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7\") pod \"manila-db-create-lg282\" (UID: \"2288fc24-1bb7-4f72-bfbf-bab43156306e\") " pod="openstack/manila-db-create-lg282" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.216766 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.218512 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.224567 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.224762 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.224873 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.224978 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-snrch" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.225200 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.282632 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dssfk\" (UniqueName: \"kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.282756 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.282849 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.309558 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lg282" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.327005 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad018da6-2296-4b87-8c0d-67f47c152cc0" path="/var/lib/kubelet/pods/ad018da6-2296-4b87-8c0d-67f47c152cc0/volumes" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.328291 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.366922 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.368433 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.397589 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.398006 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.398420 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400644 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dssfk\" (UniqueName: \"kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400758 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400783 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: 
\"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400824 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-logs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400846 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400866 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400881 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400898 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400971 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.400997 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6mw\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-kube-api-access-md6mw\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.401015 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.409344 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.424727 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dssfk\" (UniqueName: \"kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.428973 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume\") pod \"collect-profiles-29163225-bnpft\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.429060 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.453333 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503564 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503613 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-logs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503635 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503674 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503700 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503716 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503744 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503846 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.503970 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-logs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.504000 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.504603 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-logs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.505476 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbf8a2c2-e1d7-4341-b167-9162312c2b97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.505530 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506037 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506137 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6mw\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-kube-api-access-md6mw\") pod 
\"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506181 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506295 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506326 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506452 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59t7\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-kube-api-access-m59t7\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506481 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.506534 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.510200 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.513803 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.519730 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.521141 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.521760 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf8a2c2-e1d7-4341-b167-9162312c2b97-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.527417 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6mw\" (UniqueName: \"kubernetes.io/projected/fbf8a2c2-e1d7-4341-b167-9162312c2b97-kube-api-access-md6mw\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.549419 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"fbf8a2c2-e1d7-4341-b167-9162312c2b97\") " pod="openstack/glance-default-external-api-0" Jun 13 05:45:00 crc kubenswrapper[4894]: I0613 05:45:00.588038 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608059 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608315 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608364 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59t7\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-kube-api-access-m59t7\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608383 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608416 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608439 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608470 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608503 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-logs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.608527 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: 
I0613 05:45:00.608685 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.618621 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.620047 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52594d15-d5e4-432c-8125-d9e5ed137ad3-logs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.626692 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.628465 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.652254 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.652368 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52594d15-d5e4-432c-8125-d9e5ed137ad3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.652572 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.653119 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59t7\" (UniqueName: \"kubernetes.io/projected/52594d15-d5e4-432c-8125-d9e5ed137ad3-kube-api-access-m59t7\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.658240 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"52594d15-d5e4-432c-8125-d9e5ed137ad3\") " pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.765317 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:00.836632 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lg282"] Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.104370 4894 generic.go:334] "Generic (PLEG): container finished" podID="2288fc24-1bb7-4f72-bfbf-bab43156306e" containerID="70c8cd870760fb9644dfc2cc9acd46b78d94c5117f5172196a2c79f7aec7e5fb" exitCode=0 Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.104467 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lg282" event={"ID":"2288fc24-1bb7-4f72-bfbf-bab43156306e","Type":"ContainerDied","Data":"70c8cd870760fb9644dfc2cc9acd46b78d94c5117f5172196a2c79f7aec7e5fb"} Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.104702 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lg282" event={"ID":"2288fc24-1bb7-4f72-bfbf-bab43156306e","Type":"ContainerStarted","Data":"76825d52fe4afb28323e1f5883f773a8b13e21750706950a5fa19558b4a77344"} Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.111885 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"285093b2-d93f-4e96-86e2-66bfe23a93e2","Type":"ContainerStarted","Data":"544dd53aa66c43a3fe53abbca244110662b122474b5e8c51189f46d903006bce"} Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.140831 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"768fd773-29d0-4a76-9b25-aa40764378a0","Type":"ContainerStarted","Data":"736bafe92d25d28af8069a6f0bea9c9811983aad2a4e8e90849baf572a804c05"} Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.448136 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-pghtv"] Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.449356 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.452569 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.543863 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmqt\" (UniqueName: \"kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.543972 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.645874 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.645917 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.646007 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmqt\" (UniqueName: \"kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.666291 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft"] Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.688547 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmqt\" (UniqueName: \"kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt\") pod \"crc-debug-pghtv\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.774049 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-pghtv" Jun 13 05:45:01 crc kubenswrapper[4894]: I0613 05:45:01.782224 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.126440 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jun 13 05:45:02 crc kubenswrapper[4894]: W0613 05:45:02.136825 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbf8a2c2_e1d7_4341_b167_9162312c2b97.slice/crio-b81f9d1aa34a00434aa6533c6edf7c3db4e939c9013f08ee559d4ca79cd034b3 WatchSource:0}: Error finding container b81f9d1aa34a00434aa6533c6edf7c3db4e939c9013f08ee559d4ca79cd034b3: Status 404 returned error can't find the container with id b81f9d1aa34a00434aa6533c6edf7c3db4e939c9013f08ee559d4ca79cd034b3 Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.159350 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-pghtv" event={"ID":"04062b1d-1c42-4e34-857e-d3a6c87cd953","Type":"ContainerStarted","Data":"158fce60ccbb842f65882a4394f3afa932eccfde3471abb1406994fba8deda7a"} Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.160153 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbf8a2c2-e1d7-4341-b167-9162312c2b97","Type":"ContainerStarted","Data":"b81f9d1aa34a00434aa6533c6edf7c3db4e939c9013f08ee559d4ca79cd034b3"} Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.162132 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" event={"ID":"3698a893-850a-4019-807a-0f351a858b35","Type":"ContainerStarted","Data":"2cd728104406c3b7d21625333ad6ca1a6ae2d952ebee1a94c7707d74f9ad2d8f"} Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.162152 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" event={"ID":"3698a893-850a-4019-807a-0f351a858b35","Type":"ContainerStarted","Data":"d45446ddab7c6b3f15351f8079e082679ee90cf0217fe7a0da41dc0e14fa4db0"} Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.166369 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52594d15-d5e4-432c-8125-d9e5ed137ad3","Type":"ContainerStarted","Data":"9c7b6382e13579fd33540a381546812de84f362bdcf7496f71716116b35e3814"} Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.181349 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" podStartSLOduration=2.181333268 podStartE2EDuration="2.181333268s" podCreationTimestamp="2025-06-13 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:02.179007412 +0000 UTC m=+3260.625254875" watchObservedRunningTime="2025-06-13 05:45:02.181333268 +0000 UTC m=+3260.627580721" Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.823603 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-lg282" Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.980638 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwj7\" (UniqueName: \"kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7\") pod \"2288fc24-1bb7-4f72-bfbf-bab43156306e\" (UID: \"2288fc24-1bb7-4f72-bfbf-bab43156306e\") " Jun 13 05:45:02 crc kubenswrapper[4894]: I0613 05:45:02.992762 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7" (OuterVolumeSpecName: "kube-api-access-zlwj7") pod "2288fc24-1bb7-4f72-bfbf-bab43156306e" (UID: "2288fc24-1bb7-4f72-bfbf-bab43156306e"). InnerVolumeSpecName "kube-api-access-zlwj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.082983 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwj7\" (UniqueName: \"kubernetes.io/projected/2288fc24-1bb7-4f72-bfbf-bab43156306e-kube-api-access-zlwj7\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.193115 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"768fd773-29d0-4a76-9b25-aa40764378a0","Type":"ContainerStarted","Data":"1568db68051d56f3b6cb55df841c1cfeaf2fc1c6aef5131e64d6171aa690c9b9"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.193158 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"768fd773-29d0-4a76-9b25-aa40764378a0","Type":"ContainerStarted","Data":"2ea7033416c49b89c882ffcdeb3fba966ff9e6483982ca89360a0ea959ff578f"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.205467 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-pghtv" event={"ID":"04062b1d-1c42-4e34-857e-d3a6c87cd953","Type":"ContainerStarted","Data":"3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.211386 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lg282" event={"ID":"2288fc24-1bb7-4f72-bfbf-bab43156306e","Type":"ContainerDied","Data":"76825d52fe4afb28323e1f5883f773a8b13e21750706950a5fa19558b4a77344"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.211412 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76825d52fe4afb28323e1f5883f773a8b13e21750706950a5fa19558b4a77344" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.211474 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-lg282" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.221258 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.157507495 podStartE2EDuration="4.22124129s" podCreationTimestamp="2025-06-13 05:44:59 +0000 UTC" firstStartedPulling="2025-06-13 05:45:00.44207614 +0000 UTC m=+3258.888323603" lastFinishedPulling="2025-06-13 05:45:01.505809935 +0000 UTC m=+3259.952057398" observedRunningTime="2025-06-13 05:45:03.215045774 +0000 UTC m=+3261.661293237" watchObservedRunningTime="2025-06-13 05:45:03.22124129 +0000 UTC m=+3261.667488753" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.235887 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-pghtv" podStartSLOduration=2.235872643 podStartE2EDuration="2.235872643s" podCreationTimestamp="2025-06-13 05:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:03.23082216 +0000 UTC m=+3261.677069613" watchObservedRunningTime="2025-06-13 05:45:03.235872643 +0000 UTC m=+3261.682120106" Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.255333 4894 generic.go:334] "Generic (PLEG): container finished" podID="3698a893-850a-4019-807a-0f351a858b35" containerID="2cd728104406c3b7d21625333ad6ca1a6ae2d952ebee1a94c7707d74f9ad2d8f" exitCode=0 Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.255399 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" event={"ID":"3698a893-850a-4019-807a-0f351a858b35","Type":"ContainerDied","Data":"2cd728104406c3b7d21625333ad6ca1a6ae2d952ebee1a94c7707d74f9ad2d8f"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.265109 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"285093b2-d93f-4e96-86e2-66bfe23a93e2","Type":"ContainerStarted","Data":"826cb11d8941691df7fa30cf88f316db2882f3139f8a052ee9b6fb56c9eb9a55"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.265149 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"285093b2-d93f-4e96-86e2-66bfe23a93e2","Type":"ContainerStarted","Data":"a0b0beffd4d071772555fc3c1bf62e3ceeb631fecf4c6e01c8a9edf2c2bb9c04"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.269914 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52594d15-d5e4-432c-8125-d9e5ed137ad3","Type":"ContainerStarted","Data":"7bce97933e7c36aeca6bd644757a58404ca08ec84d2cc51aa3ce10a952ebab2a"} Jun 13 05:45:03 crc kubenswrapper[4894]: I0613 05:45:03.310061 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.155058265 podStartE2EDuration="4.310044129s" podCreationTimestamp="2025-06-13 05:44:59 +0000 UTC" firstStartedPulling="2025-06-13 05:45:00.328965893 +0000 UTC m=+3258.775213356" lastFinishedPulling="2025-06-13 05:45:01.483951757 +0000 UTC m=+3259.930199220" observedRunningTime="2025-06-13 05:45:03.300885761 +0000 UTC m=+3261.747133224" watchObservedRunningTime="2025-06-13 05:45:03.310044129 +0000 UTC m=+3261.756291592" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.286406 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"fbf8a2c2-e1d7-4341-b167-9162312c2b97","Type":"ContainerStarted","Data":"1e36a7ccc75f1e503543e8cf88896022922a7ed06f759c54d0924c412adcc1fb"} Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.286686 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbf8a2c2-e1d7-4341-b167-9162312c2b97","Type":"ContainerStarted","Data":"4a1136d113615138b292e245054f0b79a2a44241311c71186c33936024723a8f"} Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.286777 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52594d15-d5e4-432c-8125-d9e5ed137ad3","Type":"ContainerStarted","Data":"4fb7d73aaffa49b28c7ee44bf115ed1a52ec7edbaadc41b40b3d375d93fd14f7"} Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.300442 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.3004244400000005 podStartE2EDuration="5.30042444s" podCreationTimestamp="2025-06-13 05:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:04.295760929 +0000 UTC m=+3262.742008392" watchObservedRunningTime="2025-06-13 05:45:04.30042444 +0000 UTC m=+3262.746671903" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.325679 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.325649473 podStartE2EDuration="5.325649473s" podCreationTimestamp="2025-06-13 05:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:04.319190271 +0000 UTC m=+3262.765437754" watchObservedRunningTime="2025-06-13 05:45:04.325649473 +0000 UTC m=+3262.771896926" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.627006 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.708306 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.766417 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.830872 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dssfk\" (UniqueName: \"kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk\") pod \"3698a893-850a-4019-807a-0f351a858b35\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.831021 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume\") pod \"3698a893-850a-4019-807a-0f351a858b35\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.831094 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume\") pod \"3698a893-850a-4019-807a-0f351a858b35\" (UID: \"3698a893-850a-4019-807a-0f351a858b35\") " Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.833971 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume" (OuterVolumeSpecName: "config-volume") pod "3698a893-850a-4019-807a-0f351a858b35" (UID: "3698a893-850a-4019-807a-0f351a858b35"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.837209 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3698a893-850a-4019-807a-0f351a858b35" (UID: "3698a893-850a-4019-807a-0f351a858b35"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.842075 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk" (OuterVolumeSpecName: "kube-api-access-dssfk") pod "3698a893-850a-4019-807a-0f351a858b35" (UID: "3698a893-850a-4019-807a-0f351a858b35"). InnerVolumeSpecName "kube-api-access-dssfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.935911 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dssfk\" (UniqueName: \"kubernetes.io/projected/3698a893-850a-4019-807a-0f351a858b35-kube-api-access-dssfk\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.935957 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3698a893-850a-4019-807a-0f351a858b35-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:04 crc kubenswrapper[4894]: I0613 05:45:04.935970 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3698a893-850a-4019-807a-0f351a858b35-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:05 crc kubenswrapper[4894]: I0613 05:45:05.297657 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" event={"ID":"3698a893-850a-4019-807a-0f351a858b35","Type":"ContainerDied","Data":"d45446ddab7c6b3f15351f8079e082679ee90cf0217fe7a0da41dc0e14fa4db0"} Jun 13 05:45:05 crc kubenswrapper[4894]: I0613 05:45:05.307028 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45446ddab7c6b3f15351f8079e082679ee90cf0217fe7a0da41dc0e14fa4db0" Jun 13 05:45:05 crc kubenswrapper[4894]: I0613 05:45:05.299474 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163225-bnpft" Jun 13 05:45:05 crc kubenswrapper[4894]: I0613 05:45:05.851075 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp"] Jun 13 05:45:05 crc kubenswrapper[4894]: I0613 05:45:05.859001 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163180-6wgjp"] Jun 13 05:45:06 crc kubenswrapper[4894]: I0613 05:45:06.286789 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279dc827-6e3f-44e7-b36a-77e117eb9f07" path="/var/lib/kubelet/pods/279dc827-6e3f-44e7-b36a-77e117eb9f07/volumes" Jun 13 05:45:09 crc kubenswrapper[4894]: I0613 05:45:09.795364 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.048698 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.126762 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-ba53-account-create-54cs2"] Jun 13 05:45:10 crc kubenswrapper[4894]: E0613 05:45:10.127143 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3698a893-850a-4019-807a-0f351a858b35" containerName="collect-profiles" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.127159 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3698a893-850a-4019-807a-0f351a858b35" containerName="collect-profiles" Jun 13 05:45:10 crc kubenswrapper[4894]: E0613 05:45:10.127175 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2288fc24-1bb7-4f72-bfbf-bab43156306e" containerName="mariadb-database-create" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.127182 4894 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2288fc24-1bb7-4f72-bfbf-bab43156306e" containerName="mariadb-database-create" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.127361 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2288fc24-1bb7-4f72-bfbf-bab43156306e" containerName="mariadb-database-create" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.127379 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3698a893-850a-4019-807a-0f351a858b35" containerName="collect-profiles" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.128019 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.136908 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.163714 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-ba53-account-create-54cs2"] Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.254682 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbjk\" (UniqueName: \"kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk\") pod \"manila-ba53-account-create-54cs2\" (UID: \"654e1e96-bce6-4bb5-8437-d316769e5104\") " pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.358890 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbjk\" (UniqueName: \"kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk\") pod \"manila-ba53-account-create-54cs2\" (UID: \"654e1e96-bce6-4bb5-8437-d316769e5104\") " pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.403234 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbjk\" (UniqueName: \"kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk\") pod \"manila-ba53-account-create-54cs2\" (UID: \"654e1e96-bce6-4bb5-8437-d316769e5104\") " pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.450132 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.588577 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.588625 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.660089 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.674638 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.766733 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.767030 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.794325 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.803259 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:10 crc kubenswrapper[4894]: I0613 05:45:10.967475 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-ba53-account-create-54cs2"] Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.412592 4894 generic.go:334] "Generic (PLEG): container finished" podID="654e1e96-bce6-4bb5-8437-d316769e5104" containerID="0f3727ce100f25da2de288825900b7609beb440e4f8fa1dd6fd2d6e9cad40dd9" exitCode=0 Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.412698 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-ba53-account-create-54cs2" event={"ID":"654e1e96-bce6-4bb5-8437-d316769e5104","Type":"ContainerDied","Data":"0f3727ce100f25da2de288825900b7609beb440e4f8fa1dd6fd2d6e9cad40dd9"} Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.414585 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-ba53-account-create-54cs2" event={"ID":"654e1e96-bce6-4bb5-8437-d316769e5104","Type":"ContainerStarted","Data":"3a5ea0610a170f9fb6ea51ea5d9a99e321ec63d9f5da305d7bf9fe38cd531ddb"} Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.414609 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.414622 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.414631 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jun 13 05:45:11 crc kubenswrapper[4894]: I0613 05:45:11.414639 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:12 crc kubenswrapper[4894]: I0613 05:45:12.736116 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:12 crc kubenswrapper[4894]: I0613 05:45:12.905770 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jbjk\" (UniqueName: \"kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk\") pod \"654e1e96-bce6-4bb5-8437-d316769e5104\" (UID: \"654e1e96-bce6-4bb5-8437-d316769e5104\") " Jun 13 05:45:12 crc kubenswrapper[4894]: I0613 05:45:12.910811 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk" (OuterVolumeSpecName: "kube-api-access-4jbjk") pod "654e1e96-bce6-4bb5-8437-d316769e5104" (UID: "654e1e96-bce6-4bb5-8437-d316769e5104"). InnerVolumeSpecName "kube-api-access-4jbjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.009031 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jbjk\" (UniqueName: \"kubernetes.io/projected/654e1e96-bce6-4bb5-8437-d316769e5104-kube-api-access-4jbjk\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.145486 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-pghtv"] Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.145760 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-pghtv" podUID="04062b1d-1c42-4e34-857e-d3a6c87cd953" containerName="container-00" containerID="cri-o://3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a" gracePeriod=2 Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.152540 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-pghtv"] Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.222486 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-pghtv" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.312612 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host\") pod \"04062b1d-1c42-4e34-857e-d3a6c87cd953\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.312747 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host" (OuterVolumeSpecName: "host") pod "04062b1d-1c42-4e34-857e-d3a6c87cd953" (UID: "04062b1d-1c42-4e34-857e-d3a6c87cd953"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.312815 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npmqt\" (UniqueName: \"kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt\") pod \"04062b1d-1c42-4e34-857e-d3a6c87cd953\" (UID: \"04062b1d-1c42-4e34-857e-d3a6c87cd953\") " Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.313293 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04062b1d-1c42-4e34-857e-d3a6c87cd953-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.316108 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt" (OuterVolumeSpecName: "kube-api-access-npmqt") pod "04062b1d-1c42-4e34-857e-d3a6c87cd953" (UID: "04062b1d-1c42-4e34-857e-d3a6c87cd953"). InnerVolumeSpecName "kube-api-access-npmqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.415587 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npmqt\" (UniqueName: \"kubernetes.io/projected/04062b1d-1c42-4e34-857e-d3a6c87cd953-kube-api-access-npmqt\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.431565 4894 generic.go:334] "Generic (PLEG): container finished" podID="04062b1d-1c42-4e34-857e-d3a6c87cd953" containerID="3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a" exitCode=0 Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.431690 4894 scope.go:117] "RemoveContainer" containerID="3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.431827 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-pghtv" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.442135 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-ba53-account-create-54cs2" event={"ID":"654e1e96-bce6-4bb5-8437-d316769e5104","Type":"ContainerDied","Data":"3a5ea0610a170f9fb6ea51ea5d9a99e321ec63d9f5da305d7bf9fe38cd531ddb"} Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.442173 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a5ea0610a170f9fb6ea51ea5d9a99e321ec63d9f5da305d7bf9fe38cd531ddb" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.442229 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-ba53-account-create-54cs2" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.442887 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.442917 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.443188 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.443290 4894 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.460566 4894 scope.go:117] "RemoveContainer" containerID="3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a" Jun 13 05:45:13 crc kubenswrapper[4894]: E0613 05:45:13.463361 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a\": container with ID starting with 3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a not found: ID does not exist" containerID="3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a" Jun 13 05:45:13 crc kubenswrapper[4894]: I0613 05:45:13.463413 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a"} err="failed to get container status \"3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a\": rpc error: code = NotFound desc = could not find container \"3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a\": container with ID starting with 3f20ae3c56ae8f99be38f8557014d5926def5f6b7d54ebb148dea699413c346a not found: ID does not exist" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.092109 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.105937 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.107246 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.128969 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.307582 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04062b1d-1c42-4e34-857e-d3a6c87cd953" path="/var/lib/kubelet/pods/04062b1d-1c42-4e34-857e-d3a6c87cd953/volumes" Jun 13 05:45:14 crc kubenswrapper[4894]: I0613 05:45:14.708562 4894 scope.go:117] "RemoveContainer" containerID="85044fda6b7e1b33816bafd58cc20e23f66e54a0e9d71d73a5baff597d29a57d" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.457298 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-mqtjp"] Jun 13 05:45:15 crc kubenswrapper[4894]: E0613 05:45:15.458195 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654e1e96-bce6-4bb5-8437-d316769e5104" containerName="mariadb-account-create" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.458210 4894 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="654e1e96-bce6-4bb5-8437-d316769e5104" containerName="mariadb-account-create" Jun 13 05:45:15 crc kubenswrapper[4894]: E0613 05:45:15.458234 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04062b1d-1c42-4e34-857e-d3a6c87cd953" containerName="container-00" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.458243 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="04062b1d-1c42-4e34-857e-d3a6c87cd953" containerName="container-00" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.458449 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="04062b1d-1c42-4e34-857e-d3a6c87cd953" containerName="container-00" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.458479 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="654e1e96-bce6-4bb5-8437-d316769e5104" containerName="mariadb-account-create" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.459156 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.460943 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-2k6hc" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.461466 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.481676 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-mqtjp"] Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.555026 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.555072 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.555879 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qwb\" (UniqueName: \"kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.555911 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.665063 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: 
I0613 05:45:15.665131 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.665198 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qwb\" (UniqueName: \"kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.665253 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.687142 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.687970 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.695219 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.699248 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qwb\" (UniqueName: \"kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb\") pod \"manila-db-sync-mqtjp\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:15 crc kubenswrapper[4894]: I0613 05:45:15.795854 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:16 crc kubenswrapper[4894]: I0613 05:45:16.539450 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-mqtjp"] Jun 13 05:45:17 crc kubenswrapper[4894]: I0613 05:45:17.524998 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-mqtjp" event={"ID":"472fb412-9ee0-4bdb-b7e0-ec470d468b4b","Type":"ContainerStarted","Data":"ad0444d081755d31fb8d2535390336418663545ba64038310b5551bfff823896"} Jun 13 05:45:22 crc kubenswrapper[4894]: I0613 05:45:22.605690 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-mqtjp" event={"ID":"472fb412-9ee0-4bdb-b7e0-ec470d468b4b","Type":"ContainerStarted","Data":"b6de53990a689386be7d5e33b2267a83e96aac4ef9647b46de766f7a7663f3e4"} Jun 13 05:45:22 crc kubenswrapper[4894]: I0613 05:45:22.631642 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-mqtjp" podStartSLOduration=2.804069133 podStartE2EDuration="7.631619548s" podCreationTimestamp="2025-06-13 05:45:15 +0000 UTC" firstStartedPulling="2025-06-13 05:45:16.56489733 +0000 UTC m=+3275.011144793" lastFinishedPulling="2025-06-13 05:45:21.392447745 +0000 UTC m=+3279.838695208" observedRunningTime="2025-06-13 05:45:22.62144551 +0000 UTC m=+3281.067692973" watchObservedRunningTime="2025-06-13 05:45:22.631619548 +0000 UTC m=+3281.077867041" Jun 13 05:45:32 crc kubenswrapper[4894]: I0613 05:45:32.708230 4894 generic.go:334] "Generic (PLEG): container finished" podID="472fb412-9ee0-4bdb-b7e0-ec470d468b4b" containerID="b6de53990a689386be7d5e33b2267a83e96aac4ef9647b46de766f7a7663f3e4" exitCode=0 Jun 13 05:45:32 crc kubenswrapper[4894]: I0613 05:45:32.708358 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-mqtjp" event={"ID":"472fb412-9ee0-4bdb-b7e0-ec470d468b4b","Type":"ContainerDied","Data":"b6de53990a689386be7d5e33b2267a83e96aac4ef9647b46de766f7a7663f3e4"} Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.297242 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.458879 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle\") pod \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.459148 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data\") pod \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.459213 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data\") pod \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.459273 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qwb\" (UniqueName: \"kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb\") pod \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\" (UID: \"472fb412-9ee0-4bdb-b7e0-ec470d468b4b\") " Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.467476 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb" (OuterVolumeSpecName: "kube-api-access-f2qwb") pod "472fb412-9ee0-4bdb-b7e0-ec470d468b4b" (UID: "472fb412-9ee0-4bdb-b7e0-ec470d468b4b"). InnerVolumeSpecName "kube-api-access-f2qwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.470498 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "472fb412-9ee0-4bdb-b7e0-ec470d468b4b" (UID: "472fb412-9ee0-4bdb-b7e0-ec470d468b4b"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.472341 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data" (OuterVolumeSpecName: "config-data") pod "472fb412-9ee0-4bdb-b7e0-ec470d468b4b" (UID: "472fb412-9ee0-4bdb-b7e0-ec470d468b4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.500632 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "472fb412-9ee0-4bdb-b7e0-ec470d468b4b" (UID: "472fb412-9ee0-4bdb-b7e0-ec470d468b4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.562851 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2qwb\" (UniqueName: \"kubernetes.io/projected/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-kube-api-access-f2qwb\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.562889 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.562901 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.562915 4894 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/472fb412-9ee0-4bdb-b7e0-ec470d468b4b-job-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.735191 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-mqtjp" event={"ID":"472fb412-9ee0-4bdb-b7e0-ec470d468b4b","Type":"ContainerDied","Data":"ad0444d081755d31fb8d2535390336418663545ba64038310b5551bfff823896"} Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.735399 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0444d081755d31fb8d2535390336418663545ba64038310b5551bfff823896" Jun 13 05:45:34 crc kubenswrapper[4894]: I0613 05:45:34.735232 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-mqtjp" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.203838 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: E0613 05:45:35.204187 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472fb412-9ee0-4bdb-b7e0-ec470d468b4b" containerName="manila-db-sync" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.204203 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="472fb412-9ee0-4bdb-b7e0-ec470d468b4b" containerName="manila-db-sync" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.204363 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="472fb412-9ee0-4bdb-b7e0-ec470d468b4b" containerName="manila-db-sync" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.205268 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.227315 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-2k6hc" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.231813 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.231959 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.232116 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.237424 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.238994 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.241380 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.253919 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.277082 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.298361 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-548948d657-2xbqw"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.302783 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.322947 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-548948d657-2xbqw"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.383730 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.385401 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.388216 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-openstack-edpm-ipam\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.389895 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390089 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390122 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390142 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390214 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390269 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390328 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390344 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390368 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkb5\" (UniqueName: 
\"kubernetes.io/projected/612832fe-9d71-437a-af43-c8c06931a237-kube-api-access-fmkb5\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390394 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmklp\" (UniqueName: \"kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390414 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390440 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgbz\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390478 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-sb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390727 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390768 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-nb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390798 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390822 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390842 4894 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390943 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-config\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.390999 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.391017 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-dns-svc\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.391039 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.421740 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492226 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgbz\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492284 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-sb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492334 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492368 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 
05:45:35.492388 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492413 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492439 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-nb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492480 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492500 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492517 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492540 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-config\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492563 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492578 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492592 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-dns-svc\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492613 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-587mm\" (UniqueName: \"kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492627 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492642 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-openstack-edpm-ipam\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492693 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492713 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492733 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492748 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492767 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492787 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " 
pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492814 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492829 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492857 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492873 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492892 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkb5\" (UniqueName: \"kubernetes.io/projected/612832fe-9d71-437a-af43-c8c06931a237-kube-api-access-fmkb5\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492911 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmklp\" (UniqueName: \"kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.492927 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.493000 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.494411 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-sb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.495139 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.495902 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.496250 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.496590 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-config\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.496924 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.497483 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-ovsdbserver-nb\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.497554 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-dns-svc\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.497795 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/612832fe-9d71-437a-af43-c8c06931a237-openstack-edpm-ipam\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.502928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.503939 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 
05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.504097 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.504158 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.504613 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.505021 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.507180 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts\") pod \"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.509396 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.516295 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgbz\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.522082 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkb5\" (UniqueName: \"kubernetes.io/projected/612832fe-9d71-437a-af43-c8c06931a237-kube-api-access-fmkb5\") pod \"dnsmasq-dns-548948d657-2xbqw\" (UID: \"612832fe-9d71-437a-af43-c8c06931a237\") " pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.529209 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.529308 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmklp\" (UniqueName: \"kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp\") pod 
\"manila-scheduler-0\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " pod="openstack/manila-scheduler-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.564734 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594487 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594536 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594552 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594609 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-587mm\" (UniqueName: \"kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594639 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594665 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594683 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594713 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.594981 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.595314 4894 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.596949 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.602326 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.602460 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.605059 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.608940 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.624418 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-587mm\" (UniqueName: \"kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm\") pod \"manila-api-0\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.632586 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.718449 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:35 crc kubenswrapper[4894]: I0613 05:45:35.827356 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.337047 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-548948d657-2xbqw"] Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.370249 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.482755 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.496432 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.517782 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.785795 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerStarted","Data":"d42b74f701155902696ce32c49f5ddd2fd1a0ac3ad842b8e82755bcf8133b51b"} Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.795258 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerStarted","Data":"1a48d81b1b098c794e4e991d1018d3a8b7c0d23afb603f0e8ebbe026da6d8cea"} Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.808012 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerStarted","Data":"d5e88d3a04e55820964f28973729a561c4e9f6ebdf4781dc22dcbbf88ea7d947"} Jun 13 05:45:36 crc kubenswrapper[4894]: I0613 05:45:36.818070 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548948d657-2xbqw" event={"ID":"612832fe-9d71-437a-af43-c8c06931a237","Type":"ContainerStarted","Data":"88a97a31487863c848eeeb747c21fff78af656b96bb61180f6e434ec64c5126e"} Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.639527 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.641019 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-central-agent" containerID="cri-o://3f6af05f49cdc90780781567aa8c5da8fa435aedfd3e02248ebb3daeac2c7aa9" gracePeriod=30 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.641447 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="proxy-httpd" containerID="cri-o://fd67d6f163b7585f8eb1f335db268f413b8c4e3d89ae1fbb9865187a0f5715b0" gracePeriod=30 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.641564 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="sg-core" containerID="cri-o://8357472a0e3cd45a72f60a4d0236f956f454fc4df10370e57d1e3a455687a26f" gracePeriod=30 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.641678 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-notification-agent" 
containerID="cri-o://eb6be4eb3829fb0983ce959415ddcf6ed1bc0b48a543c359cc7a054cd85f8529" gracePeriod=30 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.850992 4894 generic.go:334] "Generic (PLEG): container finished" podID="612832fe-9d71-437a-af43-c8c06931a237" containerID="f3c4a2ef29f02bf532630033945bb63f7ad08157e85dcc0cebc264f2d7f14780" exitCode=0 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.851264 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548948d657-2xbqw" event={"ID":"612832fe-9d71-437a-af43-c8c06931a237","Type":"ContainerDied","Data":"f3c4a2ef29f02bf532630033945bb63f7ad08157e85dcc0cebc264f2d7f14780"} Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.900877 4894 generic.go:334] "Generic (PLEG): container finished" podID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerID="8357472a0e3cd45a72f60a4d0236f956f454fc4df10370e57d1e3a455687a26f" exitCode=2 Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.900940 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerDied","Data":"8357472a0e3cd45a72f60a4d0236f956f454fc4df10370e57d1e3a455687a26f"} Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.936807 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerStarted","Data":"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba"} Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.936850 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerStarted","Data":"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4"} Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.937986 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.965644 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.965627293 podStartE2EDuration="2.965627293s" podCreationTimestamp="2025-06-13 05:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:37.962969838 +0000 UTC m=+3296.409217301" watchObservedRunningTime="2025-06-13 05:45:37.965627293 +0000 UTC m=+3296.411874756" Jun 13 05:45:37 crc kubenswrapper[4894]: I0613 05:45:37.991195 4894 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.186:3000/\": dial tcp 10.217.0.186:3000: connect: connection refused" Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.060425 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.950084 4894 generic.go:334] "Generic (PLEG): container finished" podID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerID="fd67d6f163b7585f8eb1f335db268f413b8c4e3d89ae1fbb9865187a0f5715b0" exitCode=0 Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.950530 4894 generic.go:334] "Generic (PLEG): container finished" podID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerID="3f6af05f49cdc90780781567aa8c5da8fa435aedfd3e02248ebb3daeac2c7aa9" exitCode=0 Jun 13 
05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.950160 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerDied","Data":"fd67d6f163b7585f8eb1f335db268f413b8c4e3d89ae1fbb9865187a0f5715b0"} Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.950595 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerDied","Data":"3f6af05f49cdc90780781567aa8c5da8fa435aedfd3e02248ebb3daeac2c7aa9"} Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.953729 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerStarted","Data":"66099791f95bedfadf1d140237e95d415d663a0af2e0f880cc40f74950d3ac8e"} Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.953761 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerStarted","Data":"6b0bfc3b3d4b402f752a461154ffe081dbe98fa77c7fc3d36cafeefe86390b2c"} Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.960835 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548948d657-2xbqw" event={"ID":"612832fe-9d71-437a-af43-c8c06931a237","Type":"ContainerStarted","Data":"b24b684631d275a4dd10de670449d34d41efb5db69c35b5ece630d1a1a663596"} Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.960874 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.977279 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.186414424 podStartE2EDuration="3.977266316s" podCreationTimestamp="2025-06-13 05:45:35 +0000 UTC" firstStartedPulling="2025-06-13 05:45:36.540343929 +0000 UTC m=+3294.986591392" lastFinishedPulling="2025-06-13 05:45:37.331195821 +0000 UTC m=+3295.777443284" observedRunningTime="2025-06-13 05:45:38.97176673 +0000 UTC m=+3297.418014193" watchObservedRunningTime="2025-06-13 05:45:38.977266316 +0000 UTC m=+3297.423513779" Jun 13 05:45:38 crc kubenswrapper[4894]: I0613 05:45:38.988788 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-548948d657-2xbqw" podStartSLOduration=3.988771701 podStartE2EDuration="3.988771701s" podCreationTimestamp="2025-06-13 05:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:38.988197545 +0000 UTC m=+3297.434445008" watchObservedRunningTime="2025-06-13 05:45:38.988771701 +0000 UTC m=+3297.435019164" Jun 13 05:45:39 crc kubenswrapper[4894]: I0613 05:45:39.975777 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api-log" containerID="cri-o://87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" gracePeriod=30 Jun 13 05:45:39 crc kubenswrapper[4894]: I0613 05:45:39.975912 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api" containerID="cri-o://74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" 
gracePeriod=30 Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.584721 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724012 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724210 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724244 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724299 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-587mm\" (UniqueName: \"kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724332 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724350 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724378 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.724396 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts\") pod \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\" (UID: \"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1\") " Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.725502 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.725546 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "etc-localtime". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.725851 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs" (OuterVolumeSpecName: "logs") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.732794 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm" (OuterVolumeSpecName: "kube-api-access-587mm") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "kube-api-access-587mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.733605 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.763879 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts" (OuterVolumeSpecName: "scripts") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.773714 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.800104 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data" (OuterVolumeSpecName: "config-data") pod "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" (UID: "e8bab056-5a3e-4c3f-b88b-ddacdf973fd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827908 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827941 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827952 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-587mm\" (UniqueName: \"kubernetes.io/projected/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-kube-api-access-587mm\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827963 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827975 4894 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-logs\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827983 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827991 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.827998 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.987954 4894 generic.go:334] "Generic (PLEG): container finished" podID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerID="74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" exitCode=0 Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.987981 4894 generic.go:334] "Generic (PLEG): container finished" podID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerID="87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" exitCode=143 Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.987997 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerDied","Data":"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba"} Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.988021 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerDied","Data":"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4"} Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.988032 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"e8bab056-5a3e-4c3f-b88b-ddacdf973fd1","Type":"ContainerDied","Data":"d42b74f701155902696ce32c49f5ddd2fd1a0ac3ad842b8e82755bcf8133b51b"} Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.988048 4894 scope.go:117] "RemoveContainer" containerID="74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" Jun 13 05:45:40 crc kubenswrapper[4894]: I0613 05:45:40.988191 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.029421 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.042675 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.060605 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:41 crc kubenswrapper[4894]: E0613 05:45:41.061049 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api-log" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.061061 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api-log" Jun 13 05:45:41 crc kubenswrapper[4894]: E0613 05:45:41.061076 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.061083 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.061335 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api-log" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.061363 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" containerName="manila-api" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.062526 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.068102 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.068489 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.070760 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.076889 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138645 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/081e7394-2489-452b-a7b3-6c12b22200c8-logs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138700 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138731 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138748 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjbx\" (UniqueName: \"kubernetes.io/projected/081e7394-2489-452b-a7b3-6c12b22200c8-kube-api-access-zhjbx\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138779 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138813 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-localtime\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138832 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-scripts\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138848 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data-custom\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138880 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.138918 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-public-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239629 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-localtime\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239678 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-scripts\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239696 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data-custom\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239730 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239770 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-public-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239893 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/081e7394-2489-452b-a7b3-6c12b22200c8-logs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239919 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239938 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239954 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhjbx\" (UniqueName: \"kubernetes.io/projected/081e7394-2489-452b-a7b3-6c12b22200c8-kube-api-access-zhjbx\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.239983 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.240074 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.240107 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/081e7394-2489-452b-a7b3-6c12b22200c8-etc-localtime\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.241105 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/081e7394-2489-452b-a7b3-6c12b22200c8-logs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.248151 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.249284 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.251131 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.251349 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-public-tls-certs\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.258087 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-scripts\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.258572 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/081e7394-2489-452b-a7b3-6c12b22200c8-config-data-custom\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.262509 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhjbx\" (UniqueName: \"kubernetes.io/projected/081e7394-2489-452b-a7b3-6c12b22200c8-kube-api-access-zhjbx\") pod \"manila-api-0\" (UID: \"081e7394-2489-452b-a7b3-6c12b22200c8\") " pod="openstack/manila-api-0" Jun 13 05:45:41 crc kubenswrapper[4894]: I0613 05:45:41.387757 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jun 13 05:45:42 crc kubenswrapper[4894]: I0613 05:45:42.002735 4894 generic.go:334] "Generic (PLEG): container finished" podID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerID="eb6be4eb3829fb0983ce959415ddcf6ed1bc0b48a543c359cc7a054cd85f8529" exitCode=0 Jun 13 05:45:42 crc kubenswrapper[4894]: I0613 05:45:42.002769 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerDied","Data":"eb6be4eb3829fb0983ce959415ddcf6ed1bc0b48a543c359cc7a054cd85f8529"} Jun 13 05:45:42 crc kubenswrapper[4894]: I0613 05:45:42.288164 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bab056-5a3e-4c3f-b88b-ddacdf973fd1" path="/var/lib/kubelet/pods/e8bab056-5a3e-4c3f-b88b-ddacdf973fd1/volumes" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.501244 4894 scope.go:117] "RemoveContainer" containerID="87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.667158 4894 scope.go:117] "RemoveContainer" containerID="74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" Jun 13 05:45:44 crc kubenswrapper[4894]: E0613 05:45:44.669114 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba\": container with ID starting with 74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba not found: ID does not exist" containerID="74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.669292 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba"} err="failed to get container status \"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba\": rpc error: code = NotFound desc = could not find container \"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba\": container with ID starting with 74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba not found: ID does not exist" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.669311 4894 scope.go:117] "RemoveContainer" containerID="87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" Jun 13 05:45:44 crc kubenswrapper[4894]: E0613 05:45:44.669890 4894 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4\": container with ID starting with 87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4 not found: ID does not exist" containerID="87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.669926 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4"} err="failed to get container status \"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4\": rpc error: code = NotFound desc = could not find container \"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4\": container with ID starting with 87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4 not found: ID does not exist" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.669940 4894 scope.go:117] "RemoveContainer" containerID="74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.670145 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba"} err="failed to get container status \"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba\": rpc error: code = NotFound desc = could not find container \"74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba\": container with ID starting with 74c2e4dd9954334d5e4e7c7252409b9f4a8bc811f561366026d095a093d3c9ba not found: ID does not exist" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.670165 4894 scope.go:117] "RemoveContainer" containerID="87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.670336 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4"} err="failed to get container status \"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4\": rpc error: code = NotFound desc = could not find container \"87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4\": container with ID starting with 87ab27d2e00add535752930ba07882fd979145932d7a4fc3e80e0b198c70e1a4 not found: ID does not exist" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.802957 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908249 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908298 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908364 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk7m9\" (UniqueName: \"kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908393 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908419 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908445 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908464 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.908580 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml\") pod \"6e7eb954-01b1-4a00-a57f-ed89b2777572\" (UID: \"6e7eb954-01b1-4a00-a57f-ed89b2777572\") " Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.909628 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.918839 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.963878 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9" (OuterVolumeSpecName: "kube-api-access-qk7m9") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "kube-api-access-qk7m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:44 crc kubenswrapper[4894]: I0613 05:45:44.964121 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts" (OuterVolumeSpecName: "scripts") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.033873 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.042789 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk7m9\" (UniqueName: \"kubernetes.io/projected/6e7eb954-01b1-4a00-a57f-ed89b2777572-kube-api-access-qk7m9\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.042823 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.042836 4894 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-log-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.042846 4894 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.042857 4894 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e7eb954-01b1-4a00-a57f-ed89b2777572-run-httpd\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.099022 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e7eb954-01b1-4a00-a57f-ed89b2777572","Type":"ContainerDied","Data":"077889bbf97da40d8df77f017b7649eb02434343a5e2598c53a9bdcaeb6b31ed"} Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.099070 4894 scope.go:117] "RemoveContainer" 
containerID="fd67d6f163b7585f8eb1f335db268f413b8c4e3d89ae1fbb9865187a0f5715b0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.099129 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.117790 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.122516 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.133743 4894 scope.go:117] "RemoveContainer" containerID="8357472a0e3cd45a72f60a4d0236f956f454fc4df10370e57d1e3a455687a26f" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.147992 4894 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.148113 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.180731 4894 scope.go:117] "RemoveContainer" containerID="eb6be4eb3829fb0983ce959415ddcf6ed1bc0b48a543c359cc7a054cd85f8529" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.196320 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.209489 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data" (OuterVolumeSpecName: "config-data") pod "6e7eb954-01b1-4a00-a57f-ed89b2777572" (UID: "6e7eb954-01b1-4a00-a57f-ed89b2777572"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.238016 4894 scope.go:117] "RemoveContainer" containerID="3f6af05f49cdc90780781567aa8c5da8fa435aedfd3e02248ebb3daeac2c7aa9" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.250136 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7eb954-01b1-4a00-a57f-ed89b2777572-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.544021 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.562794 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.571523 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:45 crc kubenswrapper[4894]: E0613 05:45:45.572072 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="proxy-httpd" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572085 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="proxy-httpd" Jun 13 05:45:45 crc kubenswrapper[4894]: E0613 05:45:45.572100 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-central-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572106 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-central-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: E0613 05:45:45.572137 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-notification-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572143 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-notification-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: E0613 05:45:45.572149 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="sg-core" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572154 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="sg-core" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572311 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="proxy-httpd" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572319 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="sg-core" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572336 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-central-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.572346 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" containerName="ceilometer-notification-agent" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.573836 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.576057 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.576329 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.579246 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.584402 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.633793 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-548948d657-2xbqw" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.684768 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.684972 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="dnsmasq-dns" containerID="cri-o://886fbaa3b82a6e72a2127379cf3549221d8674c5af04fabc32456ff527d61166" gracePeriod=10 Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697382 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-log-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697452 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qf58\" (UniqueName: \"kubernetes.io/projected/5ee84851-1e35-483d-9c39-a8f8a0de6f30-kube-api-access-8qf58\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697471 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-config-data\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697495 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697565 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-scripts\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697578 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697635 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.697686 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-run-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.798827 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.798876 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-run-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.798930 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-log-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.798956 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qf58\" (UniqueName: \"kubernetes.io/projected/5ee84851-1e35-483d-9c39-a8f8a0de6f30-kube-api-access-8qf58\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.798977 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-config-data\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.799005 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.799059 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-scripts\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.799073 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.799988 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-run-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.800358 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ee84851-1e35-483d-9c39-a8f8a0de6f30-log-httpd\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.815427 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-config-data\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.821143 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.821804 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.823075 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.828796 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.831830 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qf58\" (UniqueName: \"kubernetes.io/projected/5ee84851-1e35-483d-9c39-a8f8a0de6f30-kube-api-access-8qf58\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.834174 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee84851-1e35-483d-9c39-a8f8a0de6f30-scripts\") pod \"ceilometer-0\" (UID: \"5ee84851-1e35-483d-9c39-a8f8a0de6f30\") " pod="openstack/ceilometer-0" Jun 13 05:45:45 crc kubenswrapper[4894]: I0613 05:45:45.896496 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.145310 4894 generic.go:334] "Generic (PLEG): container finished" podID="f691414e-759b-4a4b-808b-2f079c051452" containerID="886fbaa3b82a6e72a2127379cf3549221d8674c5af04fabc32456ff527d61166" exitCode=0 Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.145632 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" event={"ID":"f691414e-759b-4a4b-808b-2f079c051452","Type":"ContainerDied","Data":"886fbaa3b82a6e72a2127379cf3549221d8674c5af04fabc32456ff527d61166"} Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.148179 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"081e7394-2489-452b-a7b3-6c12b22200c8","Type":"ContainerStarted","Data":"cc4cacc4f143f44e3f7886f1d092bc8af072a5a838321ac95274abcd6494f7d7"} Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.148204 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"081e7394-2489-452b-a7b3-6c12b22200c8","Type":"ContainerStarted","Data":"a4628ef324844be5a7ff3e99238a0beb314042780bd59ccf09ddb12668aab1b6"} Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.149995 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerStarted","Data":"ec966168ad89e055e5c7c3c45345eac9f62fa7caafdcc4dac8165189ed75e9d6"} Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.150019 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerStarted","Data":"7bd55556c72bec83e23651c2eff0a75db55dd5e3a6f1c3e44d2cdcd601648b3a"} Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.178093 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.169071804 podStartE2EDuration="11.178077332s" podCreationTimestamp="2025-06-13 05:45:35 +0000 UTC" firstStartedPulling="2025-06-13 05:45:36.496245942 +0000 UTC m=+3294.942493405" lastFinishedPulling="2025-06-13 05:45:44.50525147 +0000 UTC m=+3302.951498933" observedRunningTime="2025-06-13 05:45:46.176813746 +0000 UTC m=+3304.623061209" watchObservedRunningTime="2025-06-13 05:45:46.178077332 +0000 UTC m=+3304.624324795" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.248490 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311368 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311471 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311490 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h5tm\" (UniqueName: \"kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311577 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311605 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.311634 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config\") pod \"f691414e-759b-4a4b-808b-2f079c051452\" (UID: \"f691414e-759b-4a4b-808b-2f079c051452\") " Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.312052 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7eb954-01b1-4a00-a57f-ed89b2777572" path="/var/lib/kubelet/pods/6e7eb954-01b1-4a00-a57f-ed89b2777572/volumes" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.349896 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm" (OuterVolumeSpecName: "kube-api-access-9h5tm") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "kube-api-access-9h5tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.418469 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h5tm\" (UniqueName: \"kubernetes.io/projected/f691414e-759b-4a4b-808b-2f079c051452-kube-api-access-9h5tm\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.438159 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.501808 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.502835 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.509529 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config" (OuterVolumeSpecName: "config") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.514312 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f691414e-759b-4a4b-808b-2f079c051452" (UID: "f691414e-759b-4a4b-808b-2f079c051452"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.521364 4894 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-dns-svc\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.521388 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.521398 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.521407 4894 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.521415 4894 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f691414e-759b-4a4b-808b-2f079c051452-config\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:46 crc kubenswrapper[4894]: I0613 05:45:46.527549 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.182719 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"081e7394-2489-452b-a7b3-6c12b22200c8","Type":"ContainerStarted","Data":"8ba1edc7a732fffb75fa458da651fb426a53120cdba58081be4b551bedc18a25"} Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.184289 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.190643 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" event={"ID":"f691414e-759b-4a4b-808b-2f079c051452","Type":"ContainerDied","Data":"3844a338da683657d85fcef01033ecd71f61a30ce9c34dd24b0e50b7e2a4b0d7"} Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.190909 4894 scope.go:117] "RemoveContainer" containerID="886fbaa3b82a6e72a2127379cf3549221d8674c5af04fabc32456ff527d61166" Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.190992 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d8754cc-bjfcp" Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.198929 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ee84851-1e35-483d-9c39-a8f8a0de6f30","Type":"ContainerStarted","Data":"322069badd7aaec3465a5c9e7eb35f9b2e43c51027b63d498e0179c3ccc8f8f6"} Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.215154 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.215133683 podStartE2EDuration="6.215133683s" podCreationTimestamp="2025-06-13 05:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:45:47.208772423 +0000 UTC m=+3305.655019886" watchObservedRunningTime="2025-06-13 05:45:47.215133683 +0000 UTC m=+3305.661381146" Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.254135 4894 scope.go:117] "RemoveContainer" containerID="30a1f810122c81cd07847844354f32feb1a8c600c924afb2c6e12a144f859eac" Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.255066 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:45:47 crc kubenswrapper[4894]: I0613 05:45:47.278719 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d8754cc-bjfcp"] Jun 13 05:45:48 crc kubenswrapper[4894]: I0613 05:45:48.209390 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ee84851-1e35-483d-9c39-a8f8a0de6f30","Type":"ContainerStarted","Data":"874be2cb41348efec64300119488206cd0b35de73dc518f6bc34106ae94b3b40"} Jun 13 05:45:48 crc kubenswrapper[4894]: I0613 05:45:48.288548 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f691414e-759b-4a4b-808b-2f079c051452" path="/var/lib/kubelet/pods/f691414e-759b-4a4b-808b-2f079c051452/volumes" Jun 13 05:45:49 crc kubenswrapper[4894]: I0613 05:45:49.217408 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ee84851-1e35-483d-9c39-a8f8a0de6f30","Type":"ContainerStarted","Data":"56ae8e1b39d3679a40b93485fc374f0357f7f664de3be65aac775587f97f1b8c"} Jun 13 05:45:49 crc kubenswrapper[4894]: I0613 05:45:49.217772 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ee84851-1e35-483d-9c39-a8f8a0de6f30","Type":"ContainerStarted","Data":"dec283107d487d402c2243e3ed3ef2794ebc771615ec015ebd5e98e1d2cbe158"} Jun 13 05:45:51 crc kubenswrapper[4894]: I0613 
05:45:51.250308 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ee84851-1e35-483d-9c39-a8f8a0de6f30","Type":"ContainerStarted","Data":"a4d8a3ca943d8fe1b75e902abd36d4ce195fcd480c55a6c7705383f95695e68c"} Jun 13 05:45:51 crc kubenswrapper[4894]: I0613 05:45:51.250965 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jun 13 05:45:51 crc kubenswrapper[4894]: I0613 05:45:51.288549 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.715798624 podStartE2EDuration="6.288520075s" podCreationTimestamp="2025-06-13 05:45:45 +0000 UTC" firstStartedPulling="2025-06-13 05:45:46.544470968 +0000 UTC m=+3304.990718421" lastFinishedPulling="2025-06-13 05:45:50.117192379 +0000 UTC m=+3308.563439872" observedRunningTime="2025-06-13 05:45:51.279813329 +0000 UTC m=+3309.726060832" watchObservedRunningTime="2025-06-13 05:45:51.288520075 +0000 UTC m=+3309.734767578" Jun 13 05:45:55 crc kubenswrapper[4894]: I0613 05:45:55.565751 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jun 13 05:45:56 crc kubenswrapper[4894]: I0613 05:45:56.236334 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:45:56 crc kubenswrapper[4894]: I0613 05:45:56.236412 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.168635 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.287245 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.317641 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="manila-share" containerID="cri-o://7bd55556c72bec83e23651c2eff0a75db55dd5e3a6f1c3e44d2cdcd601648b3a" gracePeriod=30 Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.317720 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="probe" containerID="cri-o://ec966168ad89e055e5c7c3c45345eac9f62fa7caafdcc4dac8165189ed75e9d6" gracePeriod=30 Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.405898 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jun 13 05:45:57 crc kubenswrapper[4894]: I0613 05:45:57.443680 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.329239 4894 generic.go:334] "Generic (PLEG): container finished" podID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerID="ec966168ad89e055e5c7c3c45345eac9f62fa7caafdcc4dac8165189ed75e9d6" 
exitCode=0 Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.329488 4894 generic.go:334] "Generic (PLEG): container finished" podID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerID="7bd55556c72bec83e23651c2eff0a75db55dd5e3a6f1c3e44d2cdcd601648b3a" exitCode=1 Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.329709 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="manila-scheduler" containerID="cri-o://6b0bfc3b3d4b402f752a461154ffe081dbe98fa77c7fc3d36cafeefe86390b2c" gracePeriod=30 Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.329946 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerDied","Data":"ec966168ad89e055e5c7c3c45345eac9f62fa7caafdcc4dac8165189ed75e9d6"} Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.329970 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerDied","Data":"7bd55556c72bec83e23651c2eff0a75db55dd5e3a6f1c3e44d2cdcd601648b3a"} Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.330194 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="probe" containerID="cri-o://66099791f95bedfadf1d140237e95d415d663a0af2e0f880cc40f74950d3ac8e" gracePeriod=30 Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.397151 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587645 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587764 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587841 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587867 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587913 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.587985 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588022 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdgbz\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588049 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588111 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588141 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph\") pod \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\" (UID: \"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf\") " Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588214 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "etc-localtime". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.588229 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.589165 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.589191 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.589204 4894 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-var-lib-manila\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.593677 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph" (OuterVolumeSpecName: "ceph") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.593900 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz" (OuterVolumeSpecName: "kube-api-access-rdgbz") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "kube-api-access-rdgbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.594344 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts" (OuterVolumeSpecName: "scripts") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.594369 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.650848 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.690872 4894 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-ceph\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.690908 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.690919 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.690929 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.690938 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdgbz\" (UniqueName: \"kubernetes.io/projected/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-kube-api-access-rdgbz\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.691446 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data" (OuterVolumeSpecName: "config-data") pod "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" (UID: "79de06b2-e1e2-41a5-bd8f-9d92c1b615bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:45:58 crc kubenswrapper[4894]: I0613 05:45:58.793239 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.343922 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"79de06b2-e1e2-41a5-bd8f-9d92c1b615bf","Type":"ContainerDied","Data":"1a48d81b1b098c794e4e991d1018d3a8b7c0d23afb603f0e8ebbe026da6d8cea"} Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.344161 4894 scope.go:117] "RemoveContainer" containerID="ec966168ad89e055e5c7c3c45345eac9f62fa7caafdcc4dac8165189ed75e9d6" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.343976 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.348479 4894 generic.go:334] "Generic (PLEG): container finished" podID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerID="66099791f95bedfadf1d140237e95d415d663a0af2e0f880cc40f74950d3ac8e" exitCode=0 Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.348501 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerDied","Data":"66099791f95bedfadf1d140237e95d415d663a0af2e0f880cc40f74950d3ac8e"} Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.423957 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.450239 4894 scope.go:117] "RemoveContainer" containerID="7bd55556c72bec83e23651c2eff0a75db55dd5e3a6f1c3e44d2cdcd601648b3a" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.458139 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.475160 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:59 crc kubenswrapper[4894]: E0613 05:45:59.479304 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="probe" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479325 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="probe" Jun 13 05:45:59 crc kubenswrapper[4894]: E0613 05:45:59.479343 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="init" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479351 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="init" Jun 13 05:45:59 crc kubenswrapper[4894]: E0613 05:45:59.479366 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="dnsmasq-dns" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479374 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="dnsmasq-dns" Jun 13 05:45:59 crc kubenswrapper[4894]: E0613 05:45:59.479383 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="manila-share" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479390 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="manila-share" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479757 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="probe" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479775 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f691414e-759b-4a4b-808b-2f079c051452" containerName="dnsmasq-dns" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.479793 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" containerName="manila-share" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.485377 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.487215 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.487246 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.611702 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612360 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtwb\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-kube-api-access-fmtwb\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612491 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-scripts\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612584 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612734 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612830 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.612934 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-ceph\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.613027 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc 
kubenswrapper[4894]: I0613 05:45:59.613132 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715436 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-ceph\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715534 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715617 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715795 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-localtime\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715862 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715899 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtwb\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-kube-api-access-fmtwb\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715958 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-scripts\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.715989 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.716027 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.716071 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.716212 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.716315 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0220044c-1440-4924-b660-c2babe3d6acc-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.720903 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-ceph\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.720928 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-scripts\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.721830 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.722097 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.729497 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0220044c-1440-4924-b660-c2babe3d6acc-config-data\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.748280 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtwb\" (UniqueName: \"kubernetes.io/projected/0220044c-1440-4924-b660-c2babe3d6acc-kube-api-access-fmtwb\") pod \"manila-share-share1-0\" (UID: \"0220044c-1440-4924-b660-c2babe3d6acc\") " pod="openstack/manila-share-share1-0" Jun 13 05:45:59 crc kubenswrapper[4894]: I0613 05:45:59.810123 
4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jun 13 05:46:00 crc kubenswrapper[4894]: I0613 05:46:00.287152 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79de06b2-e1e2-41a5-bd8f-9d92c1b615bf" path="/var/lib/kubelet/pods/79de06b2-e1e2-41a5-bd8f-9d92c1b615bf/volumes" Jun 13 05:46:00 crc kubenswrapper[4894]: I0613 05:46:00.475085 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.379211 4894 generic.go:334] "Generic (PLEG): container finished" podID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerID="6b0bfc3b3d4b402f752a461154ffe081dbe98fa77c7fc3d36cafeefe86390b2c" exitCode=0 Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.379939 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerDied","Data":"6b0bfc3b3d4b402f752a461154ffe081dbe98fa77c7fc3d36cafeefe86390b2c"} Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.382555 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0220044c-1440-4924-b660-c2babe3d6acc","Type":"ContainerStarted","Data":"57b7939974c24e64ef1f979bc4c1369781402b578525900caa0b02193bb29902"} Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.382581 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0220044c-1440-4924-b660-c2babe3d6acc","Type":"ContainerStarted","Data":"eb1d305191025c4ffe502dfe4ad2bd25bbf2dce5ab07f2095bb065224edad8c3"} Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.551286 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-kv7x4"] Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.553727 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.557006 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.582858 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.583068 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf2r\" (UniqueName: \"kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.684357 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.684475 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.684762 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf2r\" (UniqueName: \"kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.684591 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.702281 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf2r\" (UniqueName: \"kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r\") pod \"crc-debug-kv7x4\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " pod="openstack/crc-debug-kv7x4" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788217 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788418 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788898 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmklp\" (UniqueName: \"kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788939 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788965 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788980 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.788996 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts\") pod \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\" (UID: \"bb3eae76-6a5e-4a98-895b-32aeadcc201e\") " Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.789728 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime" (OuterVolumeSpecName: "etc-localtime") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "etc-localtime". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.789777 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.797197 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.799827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp" (OuterVolumeSpecName: "kube-api-access-tmklp") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "kube-api-access-tmklp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.800002 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts" (OuterVolumeSpecName: "scripts") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.850738 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890518 4894 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890552 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmklp\" (UniqueName: \"kubernetes.io/projected/bb3eae76-6a5e-4a98-895b-32aeadcc201e-kube-api-access-tmklp\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890563 4894 reconciler_common.go:293] "Volume detached for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/bb3eae76-6a5e-4a98-895b-32aeadcc201e-etc-localtime\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890572 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890581 4894 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-scripts\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.890590 4894 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.898360 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data" (OuterVolumeSpecName: "config-data") pod "bb3eae76-6a5e-4a98-895b-32aeadcc201e" (UID: "bb3eae76-6a5e-4a98-895b-32aeadcc201e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.991936 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3eae76-6a5e-4a98-895b-32aeadcc201e-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:01 crc kubenswrapper[4894]: I0613 05:46:01.996640 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kv7x4" Jun 13 05:46:02 crc kubenswrapper[4894]: W0613 05:46:02.028226 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa361d72_15f6_4389_80d4_bab6fe8f096d.slice/crio-595c30c99d52140e61f05130612c0ff26870a198e2898a8ed026cce36ee69f5c WatchSource:0}: Error finding container 595c30c99d52140e61f05130612c0ff26870a198e2898a8ed026cce36ee69f5c: Status 404 returned error can't find the container with id 595c30c99d52140e61f05130612c0ff26870a198e2898a8ed026cce36ee69f5c Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.395500 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0220044c-1440-4924-b660-c2babe3d6acc","Type":"ContainerStarted","Data":"ee0bb76d49da55f2497e649ec49971ef2777aafc461919cd1975d1b1b06c0bf9"} Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.398213 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kv7x4" event={"ID":"aa361d72-15f6-4389-80d4-bab6fe8f096d","Type":"ContainerStarted","Data":"47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093"} Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.398254 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kv7x4" event={"ID":"aa361d72-15f6-4389-80d4-bab6fe8f096d","Type":"ContainerStarted","Data":"595c30c99d52140e61f05130612c0ff26870a198e2898a8ed026cce36ee69f5c"} Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.400946 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"bb3eae76-6a5e-4a98-895b-32aeadcc201e","Type":"ContainerDied","Data":"d5e88d3a04e55820964f28973729a561c4e9f6ebdf4781dc22dcbbf88ea7d947"} Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.401003 4894 scope.go:117] "RemoveContainer" containerID="66099791f95bedfadf1d140237e95d415d663a0af2e0f880cc40f74950d3ac8e" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.401117 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.416953 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.416940043 podStartE2EDuration="3.416940043s" podCreationTimestamp="2025-06-13 05:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:46:02.413922418 +0000 UTC m=+3320.860169881" watchObservedRunningTime="2025-06-13 05:46:02.416940043 +0000 UTC m=+3320.863187506" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.434897 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.457301 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.458884 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-kv7x4" podStartSLOduration=1.458866858 podStartE2EDuration="1.458866858s" podCreationTimestamp="2025-06-13 05:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:46:02.442184447 +0000 UTC m=+3320.888431910" watchObservedRunningTime="2025-06-13 05:46:02.458866858 +0000 UTC m=+3320.905114321" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.470978 4894 scope.go:117] "RemoveContainer" containerID="6b0bfc3b3d4b402f752a461154ffe081dbe98fa77c7fc3d36cafeefe86390b2c" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.525790 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:46:02 crc kubenswrapper[4894]: E0613 05:46:02.526951 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="manila-scheduler" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.528141 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="manila-scheduler" Jun 13 05:46:02 crc kubenswrapper[4894]: E0613 05:46:02.528164 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="probe" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.528171 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="probe" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.528401 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="probe" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.528419 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" containerName="manila-scheduler" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.535805 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.535843 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.537987 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606518 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cdh\" (UniqueName: \"kubernetes.io/projected/1cfe9e73-4d63-499a-b177-6ab1cd56f443-kube-api-access-l4cdh\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606582 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606609 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-scripts\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606633 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606671 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606713 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.606815 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.707947 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cdh\" (UniqueName: \"kubernetes.io/projected/1cfe9e73-4d63-499a-b177-6ab1cd56f443-kube-api-access-l4cdh\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708128 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-scripts\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708201 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708283 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708359 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708501 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708567 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.708594 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.709432 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-localtime\" (UniqueName: \"kubernetes.io/host-path/1cfe9e73-4d63-499a-b177-6ab1cd56f443-etc-localtime\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.714951 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.715521 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-scripts\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.717752 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.718144 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfe9e73-4d63-499a-b177-6ab1cd56f443-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.723529 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4cdh\" (UniqueName: \"kubernetes.io/projected/1cfe9e73-4d63-499a-b177-6ab1cd56f443-kube-api-access-l4cdh\") pod \"manila-scheduler-0\" (UID: \"1cfe9e73-4d63-499a-b177-6ab1cd56f443\") " pod="openstack/manila-scheduler-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.812393 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jun 13 05:46:02 crc kubenswrapper[4894]: I0613 05:46:02.854728 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jun 13 05:46:03 crc kubenswrapper[4894]: I0613 05:46:03.385917 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jun 13 05:46:03 crc kubenswrapper[4894]: I0613 05:46:03.438691 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1cfe9e73-4d63-499a-b177-6ab1cd56f443","Type":"ContainerStarted","Data":"d254ae8f2afc3fda72ea0c0cd3ac9a036247d43102736e4d24eef8366aa81936"} Jun 13 05:46:04 crc kubenswrapper[4894]: I0613 05:46:04.289677 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3eae76-6a5e-4a98-895b-32aeadcc201e" path="/var/lib/kubelet/pods/bb3eae76-6a5e-4a98-895b-32aeadcc201e/volumes" Jun 13 05:46:04 crc kubenswrapper[4894]: I0613 05:46:04.450748 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1cfe9e73-4d63-499a-b177-6ab1cd56f443","Type":"ContainerStarted","Data":"09be1339724c448d71ee646d464f25bd7c653b6c6e8a2d446dae43dd95862da1"} Jun 13 05:46:04 crc kubenswrapper[4894]: I0613 05:46:04.450996 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"1cfe9e73-4d63-499a-b177-6ab1cd56f443","Type":"ContainerStarted","Data":"b069eb68aa90783b748314ef04b981c9170c993a368ebd30f41d7e577b822fa6"} Jun 13 05:46:04 crc kubenswrapper[4894]: I0613 05:46:04.487309 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.48728908 podStartE2EDuration="2.48728908s" podCreationTimestamp="2025-06-13 05:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:46:04.480009034 +0000 UTC m=+3322.926256537" watchObservedRunningTime="2025-06-13 05:46:04.48728908 +0000 UTC m=+3322.933536563" Jun 13 05:46:09 crc kubenswrapper[4894]: I0613 05:46:09.811704 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.695165 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-kv7x4"] 
Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.696061 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-kv7x4" podUID="aa361d72-15f6-4389-80d4-bab6fe8f096d" containerName="container-00" containerID="cri-o://47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093" gracePeriod=2 Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.711803 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-kv7x4"] Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.792429 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kv7x4" Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.855372 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.938686 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host\") pod \"aa361d72-15f6-4389-80d4-bab6fe8f096d\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.938937 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcf2r\" (UniqueName: \"kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r\") pod \"aa361d72-15f6-4389-80d4-bab6fe8f096d\" (UID: \"aa361d72-15f6-4389-80d4-bab6fe8f096d\") " Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.938775 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host" (OuterVolumeSpecName: "host") pod "aa361d72-15f6-4389-80d4-bab6fe8f096d" (UID: "aa361d72-15f6-4389-80d4-bab6fe8f096d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:46:12 crc kubenswrapper[4894]: I0613 05:46:12.946059 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r" (OuterVolumeSpecName: "kube-api-access-vcf2r") pod "aa361d72-15f6-4389-80d4-bab6fe8f096d" (UID: "aa361d72-15f6-4389-80d4-bab6fe8f096d"). InnerVolumeSpecName "kube-api-access-vcf2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.041561 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcf2r\" (UniqueName: \"kubernetes.io/projected/aa361d72-15f6-4389-80d4-bab6fe8f096d-kube-api-access-vcf2r\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.041619 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa361d72-15f6-4389-80d4-bab6fe8f096d-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.573319 4894 generic.go:334] "Generic (PLEG): container finished" podID="aa361d72-15f6-4389-80d4-bab6fe8f096d" containerID="47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093" exitCode=0 Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.573405 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kv7x4" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.573413 4894 scope.go:117] "RemoveContainer" containerID="47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.616613 4894 scope.go:117] "RemoveContainer" containerID="47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093" Jun 13 05:46:13 crc kubenswrapper[4894]: E0613 05:46:13.617346 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093\": container with ID starting with 47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093 not found: ID does not exist" containerID="47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093" Jun 13 05:46:13 crc kubenswrapper[4894]: I0613 05:46:13.617397 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093"} err="failed to get container status \"47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093\": rpc error: code = NotFound desc = could not find container \"47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093\": container with ID starting with 47d5153fc907a3cb5b6f7275320437d2b63c289fb33e1a3686bf7d046525a093 not found: ID does not exist" Jun 13 05:46:14 crc kubenswrapper[4894]: I0613 05:46:14.300193 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa361d72-15f6-4389-80d4-bab6fe8f096d" path="/var/lib/kubelet/pods/aa361d72-15f6-4389-80d4-bab6fe8f096d/volumes" Jun 13 05:46:15 crc kubenswrapper[4894]: I0613 05:46:15.915358 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jun 13 05:46:21 crc kubenswrapper[4894]: I0613 05:46:21.226473 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jun 13 05:46:24 crc kubenswrapper[4894]: I0613 05:46:24.387397 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jun 13 05:46:26 crc kubenswrapper[4894]: I0613 05:46:26.236081 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:46:26 crc kubenswrapper[4894]: I0613 05:46:26.236319 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:46:52 crc kubenswrapper[4894]: E0613 05:46:52.393126 4894 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.213:41324->38.102.83.213:40951: read tcp 38.102.83.213:41324->38.102.83.213:40951: read: connection reset by peer Jun 13 05:46:56 crc kubenswrapper[4894]: I0613 05:46:56.236779 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:46:56 crc kubenswrapper[4894]: I0613 05:46:56.237452 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:46:56 crc kubenswrapper[4894]: I0613 05:46:56.237511 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:46:56 crc kubenswrapper[4894]: I0613 05:46:56.238643 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:46:56 crc kubenswrapper[4894]: I0613 05:46:56.238774 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846" gracePeriod=600 Jun 13 05:46:57 crc kubenswrapper[4894]: I0613 05:46:57.062375 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846" exitCode=0 Jun 13 05:46:57 crc kubenswrapper[4894]: I0613 05:46:57.062457 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846"} Jun 13 05:46:57 crc kubenswrapper[4894]: I0613 05:46:57.062878 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d"} Jun 13 05:46:57 crc kubenswrapper[4894]: I0613 05:46:57.062932 4894 scope.go:117] "RemoveContainer" containerID="e641b1835f2c8544f54c49e458346b29d856929661cf5370a736765a98a7902c" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.105562 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-95kqw"] Jun 13 05:47:02 crc kubenswrapper[4894]: E0613 05:47:02.106806 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa361d72-15f6-4389-80d4-bab6fe8f096d" containerName="container-00" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.106829 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa361d72-15f6-4389-80d4-bab6fe8f096d" containerName="container-00" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.107125 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa361d72-15f6-4389-80d4-bab6fe8f096d" containerName="container-00" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.108159 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.113686 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.294669 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xbqt\" (UniqueName: \"kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.295132 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.396935 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.397106 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xbqt\" (UniqueName: \"kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.397155 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.422710 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xbqt\" (UniqueName: \"kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt\") pod \"crc-debug-95kqw\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " pod="openstack/crc-debug-95kqw" Jun 13 05:47:02 crc kubenswrapper[4894]: I0613 05:47:02.438857 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-95kqw" Jun 13 05:47:03 crc kubenswrapper[4894]: I0613 05:47:03.132572 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-95kqw" event={"ID":"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef","Type":"ContainerStarted","Data":"be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19"} Jun 13 05:47:03 crc kubenswrapper[4894]: I0613 05:47:03.133074 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-95kqw" event={"ID":"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef","Type":"ContainerStarted","Data":"23576444929e66557e57c7714f7ac3d90af345a9a0629feed4a3876bc5dfe808"} Jun 13 05:47:03 crc kubenswrapper[4894]: I0613 05:47:03.150595 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-95kqw" podStartSLOduration=1.150581683 podStartE2EDuration="1.150581683s" podCreationTimestamp="2025-06-13 05:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:47:03.148841524 +0000 UTC m=+3381.595089027" watchObservedRunningTime="2025-06-13 05:47:03.150581683 +0000 UTC m=+3381.596829146" Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.980052 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.983007 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.987201 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.987262 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.987285 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jun 13 05:47:11 crc kubenswrapper[4894]: I0613 05:47:11.997799 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.140729 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.141183 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.141450 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.141893 4894 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.142194 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.142583 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xv5k\" (UniqueName: \"kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.142873 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.143091 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.143269 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.247910 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.254795 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.254976 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255107 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255150 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255231 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255263 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255331 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xv5k\" (UniqueName: \"kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.255399 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.257070 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.258077 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.258788 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.258998 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.259693 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.268858 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.276260 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.284389 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.286259 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xv5k\" (UniqueName: \"kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.307634 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.317466 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jun 13 05:47:12 crc kubenswrapper[4894]: I0613 05:47:12.853278 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.050617 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-95kqw"] Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.051132 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-95kqw" podUID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" containerName="container-00" containerID="cri-o://be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19" gracePeriod=2 Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.060036 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-95kqw"] Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.102012 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-95kqw" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.173794 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xbqt\" (UniqueName: \"kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt\") pod \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.173868 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host\") pod \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\" (UID: \"ccb7e0dc-31a8-4201-a897-5c6c8a1765ef\") " Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.174117 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host" (OuterVolumeSpecName: "host") pod "ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" (UID: "ccb7e0dc-31a8-4201-a897-5c6c8a1765ef"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.174746 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.180090 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt" (OuterVolumeSpecName: "kube-api-access-9xbqt") pod "ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" (UID: "ccb7e0dc-31a8-4201-a897-5c6c8a1765ef"). InnerVolumeSpecName "kube-api-access-9xbqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.246384 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"357d4a2c-de1e-47c2-8602-9b717b898330","Type":"ContainerStarted","Data":"6c6c7261988d3a873779229864ef15356847a4e40cd30bb16691c1eea13e4969"} Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.250067 4894 generic.go:334] "Generic (PLEG): container finished" podID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" containerID="be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19" exitCode=0 Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.250134 4894 scope.go:117] "RemoveContainer" containerID="be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.250173 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-95kqw" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.276513 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xbqt\" (UniqueName: \"kubernetes.io/projected/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef-kube-api-access-9xbqt\") on node \"crc\" DevicePath \"\"" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.288839 4894 scope.go:117] "RemoveContainer" containerID="be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19" Jun 13 05:47:13 crc kubenswrapper[4894]: E0613 05:47:13.289395 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19\": container with ID starting with be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19 not found: ID does not exist" containerID="be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19" Jun 13 05:47:13 crc kubenswrapper[4894]: I0613 05:47:13.289425 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19"} err="failed to get container status \"be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19\": rpc error: code = NotFound desc = could not find container \"be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19\": container with ID starting with be6ea9c9937dfae52132a8fba23a1d19394528a1f81d29ba230756406753ab19 not found: ID does not exist" Jun 13 05:47:14 crc kubenswrapper[4894]: I0613 05:47:14.285444 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" path="/var/lib/kubelet/pods/ccb7e0dc-31a8-4201-a897-5c6c8a1765ef/volumes" Jun 13 05:47:45 crc kubenswrapper[4894]: E0613 05:47:45.419970 4894 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jun 13 05:47:45 crc kubenswrapper[4894]: E0613 05:47:45.421003 4894 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8xv5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(357d4a2c-de1e-47c2-8602-9b717b898330): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jun 13 05:47:45 crc kubenswrapper[4894]: E0613 05:47:45.422244 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="357d4a2c-de1e-47c2-8602-9b717b898330" Jun 13 05:47:45 crc kubenswrapper[4894]: E0613 05:47:45.574759 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="357d4a2c-de1e-47c2-8602-9b717b898330" Jun 13 05:47:56 crc kubenswrapper[4894]: I0613 05:47:56.783160 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jun 13 05:47:58 crc kubenswrapper[4894]: I0613 05:47:58.709329 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"357d4a2c-de1e-47c2-8602-9b717b898330","Type":"ContainerStarted","Data":"45a1fa75515fe9f6eaa03e92778e99917ba1c3855c5d5252e05050adf1b9d6ba"} Jun 13 05:47:58 crc kubenswrapper[4894]: I0613 05:47:58.725344 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.806490972 podStartE2EDuration="48.725327452s" podCreationTimestamp="2025-06-13 05:47:10 +0000 UTC" firstStartedPulling="2025-06-13 05:47:12.859062489 +0000 UTC m=+3391.305309982" lastFinishedPulling="2025-06-13 05:47:56.777898989 +0000 UTC m=+3435.224146462" observedRunningTime="2025-06-13 05:47:58.723934293 +0000 UTC m=+3437.170181776" watchObservedRunningTime="2025-06-13 05:47:58.725327452 +0000 UTC m=+3437.171574915" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.480931 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-kl4qj"] Jun 13 05:48:01 crc kubenswrapper[4894]: E0613 05:48:01.481823 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" containerName="container-00" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.481838 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" containerName="container-00" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.482080 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb7e0dc-31a8-4201-a897-5c6c8a1765ef" containerName="container-00" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.482739 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.619706 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnm4\" (UniqueName: \"kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.620142 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.721931 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.722338 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxnm4\" (UniqueName: \"kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.722195 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.763510 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxnm4\" (UniqueName: \"kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4\") pod \"crc-debug-kl4qj\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: I0613 05:48:01.799340 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kl4qj" Jun 13 05:48:01 crc kubenswrapper[4894]: W0613 05:48:01.831602 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode402bb27_737f_42bb_8e8a_14e657c2dd3a.slice/crio-52c83c4ed8c36c9d767f0e1bcc6ae57e1a905edb7c2dea089f03a0e5201811c1 WatchSource:0}: Error finding container 52c83c4ed8c36c9d767f0e1bcc6ae57e1a905edb7c2dea089f03a0e5201811c1: Status 404 returned error can't find the container with id 52c83c4ed8c36c9d767f0e1bcc6ae57e1a905edb7c2dea089f03a0e5201811c1 Jun 13 05:48:02 crc kubenswrapper[4894]: I0613 05:48:02.743215 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kl4qj" event={"ID":"e402bb27-737f-42bb-8e8a-14e657c2dd3a","Type":"ContainerStarted","Data":"43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e"} Jun 13 05:48:02 crc kubenswrapper[4894]: I0613 05:48:02.743497 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kl4qj" event={"ID":"e402bb27-737f-42bb-8e8a-14e657c2dd3a","Type":"ContainerStarted","Data":"52c83c4ed8c36c9d767f0e1bcc6ae57e1a905edb7c2dea089f03a0e5201811c1"} Jun 13 05:48:02 crc kubenswrapper[4894]: I0613 05:48:02.769170 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-kl4qj" podStartSLOduration=1.769147818 podStartE2EDuration="1.769147818s" podCreationTimestamp="2025-06-13 05:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:48:02.758205098 +0000 UTC m=+3441.204452611" watchObservedRunningTime="2025-06-13 05:48:02.769147818 +0000 UTC m=+3441.215395321" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.585518 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-kl4qj"] Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.586466 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-kl4qj" podUID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" containerName="container-00" containerID="cri-o://43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e" gracePeriod=2 Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.600948 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-kl4qj"] Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.677269 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kl4qj" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.757084 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnm4\" (UniqueName: \"kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4\") pod \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.757263 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host\") pod \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\" (UID: \"e402bb27-737f-42bb-8e8a-14e657c2dd3a\") " Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.757505 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host" (OuterVolumeSpecName: "host") pod "e402bb27-737f-42bb-8e8a-14e657c2dd3a" (UID: "e402bb27-737f-42bb-8e8a-14e657c2dd3a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.758330 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e402bb27-737f-42bb-8e8a-14e657c2dd3a-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.763792 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4" (OuterVolumeSpecName: "kube-api-access-hxnm4") pod "e402bb27-737f-42bb-8e8a-14e657c2dd3a" (UID: "e402bb27-737f-42bb-8e8a-14e657c2dd3a"). InnerVolumeSpecName "kube-api-access-hxnm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.859451 4894 generic.go:334] "Generic (PLEG): container finished" podID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" containerID="43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e" exitCode=0 Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.859526 4894 scope.go:117] "RemoveContainer" containerID="43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.859546 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kl4qj" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.860875 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnm4\" (UniqueName: \"kubernetes.io/projected/e402bb27-737f-42bb-8e8a-14e657c2dd3a-kube-api-access-hxnm4\") on node \"crc\" DevicePath \"\"" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.892456 4894 scope.go:117] "RemoveContainer" containerID="43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e" Jun 13 05:48:12 crc kubenswrapper[4894]: E0613 05:48:12.893178 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e\": container with ID starting with 43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e not found: ID does not exist" containerID="43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e" Jun 13 05:48:12 crc kubenswrapper[4894]: I0613 05:48:12.893274 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e"} err="failed to get container status \"43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e\": rpc error: code = NotFound desc = could not find container \"43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e\": container with ID starting with 43fd8c1fb81592244ba69b450d5d7cb9c99f7b62b62366e255ccec95d378f36e not found: ID does not exist" Jun 13 05:48:14 crc kubenswrapper[4894]: I0613 05:48:14.294878 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" path="/var/lib/kubelet/pods/e402bb27-737f-42bb-8e8a-14e657c2dd3a/volumes" Jun 13 05:48:18 crc kubenswrapper[4894]: I0613 05:48:18.022969 4894 scope.go:117] "RemoveContainer" containerID="e2014cb54eac7b02c6ebdafc8643e29cc5ed59f809c7847268ca4b4670218918" Jun 13 05:48:56 crc kubenswrapper[4894]: I0613 05:48:56.237120 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:48:56 crc kubenswrapper[4894]: I0613 05:48:56.238053 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.027066 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-n5h2b"] Jun 13 05:49:02 crc kubenswrapper[4894]: E0613 05:49:02.027906 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" containerName="container-00" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.027974 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" containerName="container-00" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.028173 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="e402bb27-737f-42bb-8e8a-14e657c2dd3a" containerName="container-00" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.028745 
4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.129604 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.129762 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxpq\" (UniqueName: \"kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.233166 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.233274 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxpq\" (UniqueName: \"kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.233632 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.257313 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxpq\" (UniqueName: \"kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq\") pod \"crc-debug-n5h2b\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.351053 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-n5h2b" Jun 13 05:49:02 crc kubenswrapper[4894]: I0613 05:49:02.459013 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-n5h2b" event={"ID":"2a3bdc86-2d03-4cb7-a222-17a1e188a044","Type":"ContainerStarted","Data":"eaa9f5a05259b0e08fb079e830a1e0b0ca07966ede8e62e4d286723ffa6d9489"} Jun 13 05:49:03 crc kubenswrapper[4894]: I0613 05:49:03.474105 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-n5h2b" event={"ID":"2a3bdc86-2d03-4cb7-a222-17a1e188a044","Type":"ContainerStarted","Data":"d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873"} Jun 13 05:49:03 crc kubenswrapper[4894]: I0613 05:49:03.522614 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-n5h2b" podStartSLOduration=1.5225803789999999 podStartE2EDuration="1.522580379s" podCreationTimestamp="2025-06-13 05:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:49:03.509103358 +0000 UTC m=+3501.955350831" watchObservedRunningTime="2025-06-13 05:49:03.522580379 +0000 UTC m=+3501.968827872" Jun 13 05:49:12 crc kubenswrapper[4894]: I0613 05:49:12.916994 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-n5h2b"] Jun 13 05:49:12 crc kubenswrapper[4894]: I0613 05:49:12.917737 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-n5h2b" podUID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" containerName="container-00" containerID="cri-o://d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873" gracePeriod=2 Jun 13 05:49:12 crc kubenswrapper[4894]: I0613 05:49:12.935094 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-n5h2b"] Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.015129 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-n5h2b" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.179819 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbxpq\" (UniqueName: \"kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq\") pod \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.179942 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host\") pod \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\" (UID: \"2a3bdc86-2d03-4cb7-a222-17a1e188a044\") " Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.180167 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host" (OuterVolumeSpecName: "host") pod "2a3bdc86-2d03-4cb7-a222-17a1e188a044" (UID: "2a3bdc86-2d03-4cb7-a222-17a1e188a044"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.180789 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a3bdc86-2d03-4cb7-a222-17a1e188a044-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.193935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq" (OuterVolumeSpecName: "kube-api-access-dbxpq") pod "2a3bdc86-2d03-4cb7-a222-17a1e188a044" (UID: "2a3bdc86-2d03-4cb7-a222-17a1e188a044"). InnerVolumeSpecName "kube-api-access-dbxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.285270 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbxpq\" (UniqueName: \"kubernetes.io/projected/2a3bdc86-2d03-4cb7-a222-17a1e188a044-kube-api-access-dbxpq\") on node \"crc\" DevicePath \"\"" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.581260 4894 generic.go:334] "Generic (PLEG): container finished" podID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" containerID="d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873" exitCode=0 Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.581349 4894 scope.go:117] "RemoveContainer" containerID="d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.581412 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-n5h2b" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.611541 4894 scope.go:117] "RemoveContainer" containerID="d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873" Jun 13 05:49:13 crc kubenswrapper[4894]: E0613 05:49:13.612214 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873\": container with ID starting with d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873 not found: ID does not exist" containerID="d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873" Jun 13 05:49:13 crc kubenswrapper[4894]: I0613 05:49:13.612263 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873"} err="failed to get container status \"d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873\": rpc error: code = NotFound desc = could not find container \"d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873\": container with ID starting with d7d7924e3364ad0dce1a98c3b2d2e7bfa33a979accd23613992b63ec53e07873 not found: ID does not exist" Jun 13 05:49:14 crc kubenswrapper[4894]: I0613 05:49:14.288459 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" path="/var/lib/kubelet/pods/2a3bdc86-2d03-4cb7-a222-17a1e188a044/volumes" Jun 13 05:49:26 crc kubenswrapper[4894]: I0613 05:49:26.236115 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:49:26 crc kubenswrapper[4894]: I0613 
05:49:26.236606 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:49:56 crc kubenswrapper[4894]: I0613 05:49:56.236687 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:49:56 crc kubenswrapper[4894]: I0613 05:49:56.237716 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:49:56 crc kubenswrapper[4894]: I0613 05:49:56.237810 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:49:56 crc kubenswrapper[4894]: I0613 05:49:56.239137 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:49:56 crc kubenswrapper[4894]: I0613 05:49:56.239265 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" gracePeriod=600 Jun 13 05:49:56 crc kubenswrapper[4894]: E0613 05:49:56.387511 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:49:57 crc kubenswrapper[4894]: I0613 05:49:57.036628 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" exitCode=0 Jun 13 05:49:57 crc kubenswrapper[4894]: I0613 05:49:57.036688 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d"} Jun 13 05:49:57 crc kubenswrapper[4894]: I0613 05:49:57.036721 4894 scope.go:117] "RemoveContainer" containerID="aa5f683f906a9dbd4da249cc50ffcfc60e01c6f3f34a0912f2675d2dced21846" Jun 13 05:49:57 crc kubenswrapper[4894]: I0613 05:49:57.037828 4894 scope.go:117] "RemoveContainer" 
containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:49:57 crc kubenswrapper[4894]: E0613 05:49:57.038333 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.378712 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-rgc7s"] Jun 13 05:50:02 crc kubenswrapper[4894]: E0613 05:50:02.379841 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" containerName="container-00" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.379856 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" containerName="container-00" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.380055 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a3bdc86-2d03-4cb7-a222-17a1e188a044" containerName="container-00" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.380885 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.496039 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl7g\" (UniqueName: \"kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.496613 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.599859 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cl7g\" (UniqueName: \"kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.600021 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.600290 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.633334 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cl7g\" (UniqueName: 
\"kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g\") pod \"crc-debug-rgc7s\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " pod="openstack/crc-debug-rgc7s" Jun 13 05:50:02 crc kubenswrapper[4894]: I0613 05:50:02.711413 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-rgc7s" Jun 13 05:50:03 crc kubenswrapper[4894]: I0613 05:50:03.116017 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-rgc7s" event={"ID":"14ce164a-6172-4ce4-8c63-2482598a72ba","Type":"ContainerStarted","Data":"31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f"} Jun 13 05:50:03 crc kubenswrapper[4894]: I0613 05:50:03.116573 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-rgc7s" event={"ID":"14ce164a-6172-4ce4-8c63-2482598a72ba","Type":"ContainerStarted","Data":"30ff706caa38169d92d899b782f35ceb58891c8f83e3cdfb897f9c18264bc38e"} Jun 13 05:50:03 crc kubenswrapper[4894]: I0613 05:50:03.145916 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-rgc7s" podStartSLOduration=1.145886779 podStartE2EDuration="1.145886779s" podCreationTimestamp="2025-06-13 05:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:50:03.137153112 +0000 UTC m=+3561.583400615" watchObservedRunningTime="2025-06-13 05:50:03.145886779 +0000 UTC m=+3561.592134272" Jun 13 05:50:07 crc kubenswrapper[4894]: I0613 05:50:07.277594 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:50:07 crc kubenswrapper[4894]: E0613 05:50:07.278427 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:50:13 crc kubenswrapper[4894]: E0613 05:50:13.328260 4894 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.213:47466->38.102.83.213:40951: write tcp 38.102.83.213:47466->38.102.83.213:40951: write: broken pipe Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.358727 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-rgc7s"] Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.359067 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-rgc7s" podUID="14ce164a-6172-4ce4-8c63-2482598a72ba" containerName="container-00" containerID="cri-o://31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f" gracePeriod=2 Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.369381 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-rgc7s"] Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.474496 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rgc7s" Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.574358 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cl7g\" (UniqueName: \"kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g\") pod \"14ce164a-6172-4ce4-8c63-2482598a72ba\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.574503 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host\") pod \"14ce164a-6172-4ce4-8c63-2482598a72ba\" (UID: \"14ce164a-6172-4ce4-8c63-2482598a72ba\") " Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.574734 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host" (OuterVolumeSpecName: "host") pod "14ce164a-6172-4ce4-8c63-2482598a72ba" (UID: "14ce164a-6172-4ce4-8c63-2482598a72ba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.575234 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14ce164a-6172-4ce4-8c63-2482598a72ba-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.580321 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g" (OuterVolumeSpecName: "kube-api-access-5cl7g") pod "14ce164a-6172-4ce4-8c63-2482598a72ba" (UID: "14ce164a-6172-4ce4-8c63-2482598a72ba"). InnerVolumeSpecName "kube-api-access-5cl7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:50:13 crc kubenswrapper[4894]: I0613 05:50:13.676547 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cl7g\" (UniqueName: \"kubernetes.io/projected/14ce164a-6172-4ce4-8c63-2482598a72ba-kube-api-access-5cl7g\") on node \"crc\" DevicePath \"\"" Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.263105 4894 generic.go:334] "Generic (PLEG): container finished" podID="14ce164a-6172-4ce4-8c63-2482598a72ba" containerID="31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f" exitCode=0 Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.263189 4894 scope.go:117] "RemoveContainer" containerID="31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f" Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.263203 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rgc7s" Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.295585 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ce164a-6172-4ce4-8c63-2482598a72ba" path="/var/lib/kubelet/pods/14ce164a-6172-4ce4-8c63-2482598a72ba/volumes" Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.298784 4894 scope.go:117] "RemoveContainer" containerID="31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f" Jun 13 05:50:14 crc kubenswrapper[4894]: E0613 05:50:14.299447 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f\": container with ID starting with 31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f not found: ID does not exist" containerID="31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f" Jun 13 05:50:14 crc kubenswrapper[4894]: I0613 05:50:14.299504 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f"} err="failed to get container status \"31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f\": rpc error: code = NotFound desc = could not find container \"31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f\": container with ID starting with 31748a385610089889abd16424c0036baebb3d201c8ecc9b1725e5e95e12773f not found: ID does not exist" Jun 13 05:50:19 crc kubenswrapper[4894]: I0613 05:50:19.277491 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:50:19 crc kubenswrapper[4894]: E0613 05:50:19.280097 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:50:33 crc kubenswrapper[4894]: I0613 05:50:33.277227 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:50:33 crc kubenswrapper[4894]: E0613 05:50:33.278038 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:50:48 crc kubenswrapper[4894]: I0613 05:50:48.278434 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:50:48 crc kubenswrapper[4894]: E0613 05:50:48.279850 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" 
podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.764863 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-wbjtz"] Jun 13 05:51:01 crc kubenswrapper[4894]: E0613 05:51:01.765991 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce164a-6172-4ce4-8c63-2482598a72ba" containerName="container-00" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.766014 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce164a-6172-4ce4-8c63-2482598a72ba" containerName="container-00" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.766363 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce164a-6172-4ce4-8c63-2482598a72ba" containerName="container-00" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.767366 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.868309 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwwn7\" (UniqueName: \"kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.868552 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.970333 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwwn7\" (UniqueName: \"kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.970495 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.970749 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:01 crc kubenswrapper[4894]: I0613 05:51:01.994902 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwwn7\" (UniqueName: \"kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7\") pod \"crc-debug-wbjtz\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " pod="openstack/crc-debug-wbjtz" Jun 13 05:51:02 crc kubenswrapper[4894]: I0613 05:51:02.105159 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-wbjtz" Jun 13 05:51:02 crc kubenswrapper[4894]: I0613 05:51:02.774943 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-wbjtz" event={"ID":"98d49451-7ea9-4033-ae86-912025bcae4b","Type":"ContainerStarted","Data":"bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21"} Jun 13 05:51:02 crc kubenswrapper[4894]: I0613 05:51:02.775876 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-wbjtz" event={"ID":"98d49451-7ea9-4033-ae86-912025bcae4b","Type":"ContainerStarted","Data":"74fae5e02972c1f9266e76f93dea0f9fa0a0c3d6641573e2dd0e391116fa229e"} Jun 13 05:51:02 crc kubenswrapper[4894]: I0613 05:51:02.792094 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-wbjtz" podStartSLOduration=1.792076163 podStartE2EDuration="1.792076163s" podCreationTimestamp="2025-06-13 05:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:51:02.79019015 +0000 UTC m=+3621.236437613" watchObservedRunningTime="2025-06-13 05:51:02.792076163 +0000 UTC m=+3621.238323626" Jun 13 05:51:03 crc kubenswrapper[4894]: I0613 05:51:03.276876 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:51:03 crc kubenswrapper[4894]: E0613 05:51:03.277155 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.726896 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-wbjtz"] Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.727677 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-wbjtz" podUID="98d49451-7ea9-4033-ae86-912025bcae4b" containerName="container-00" containerID="cri-o://bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21" gracePeriod=2 Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.736021 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-wbjtz"] Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.800148 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-wbjtz" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.879313 4894 generic.go:334] "Generic (PLEG): container finished" podID="98d49451-7ea9-4033-ae86-912025bcae4b" containerID="bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21" exitCode=0 Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.879374 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-wbjtz" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.879387 4894 scope.go:117] "RemoveContainer" containerID="bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.894342 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host\") pod \"98d49451-7ea9-4033-ae86-912025bcae4b\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.894456 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host" (OuterVolumeSpecName: "host") pod "98d49451-7ea9-4033-ae86-912025bcae4b" (UID: "98d49451-7ea9-4033-ae86-912025bcae4b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.894534 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwwn7\" (UniqueName: \"kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7\") pod \"98d49451-7ea9-4033-ae86-912025bcae4b\" (UID: \"98d49451-7ea9-4033-ae86-912025bcae4b\") " Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.894942 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98d49451-7ea9-4033-ae86-912025bcae4b-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.899822 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7" (OuterVolumeSpecName: "kube-api-access-dwwn7") pod "98d49451-7ea9-4033-ae86-912025bcae4b" (UID: "98d49451-7ea9-4033-ae86-912025bcae4b"). InnerVolumeSpecName "kube-api-access-dwwn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.914694 4894 scope.go:117] "RemoveContainer" containerID="bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21" Jun 13 05:51:12 crc kubenswrapper[4894]: E0613 05:51:12.915280 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21\": container with ID starting with bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21 not found: ID does not exist" containerID="bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.915316 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21"} err="failed to get container status \"bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21\": rpc error: code = NotFound desc = could not find container \"bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21\": container with ID starting with bc3967ed3aa1b3ff5bce2fba085d059524e7299128f0278607534586c1fa8d21 not found: ID does not exist" Jun 13 05:51:12 crc kubenswrapper[4894]: I0613 05:51:12.996251 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwwn7\" (UniqueName: \"kubernetes.io/projected/98d49451-7ea9-4033-ae86-912025bcae4b-kube-api-access-dwwn7\") on node \"crc\" DevicePath \"\"" Jun 13 05:51:14 crc kubenswrapper[4894]: I0613 05:51:14.289214 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d49451-7ea9-4033-ae86-912025bcae4b" path="/var/lib/kubelet/pods/98d49451-7ea9-4033-ae86-912025bcae4b/volumes" Jun 13 05:51:16 crc kubenswrapper[4894]: I0613 05:51:16.277154 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:51:16 crc kubenswrapper[4894]: E0613 05:51:16.277917 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:31 crc kubenswrapper[4894]: I0613 05:51:31.277491 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:51:31 crc kubenswrapper[4894]: E0613 05:51:31.278567 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:42 crc kubenswrapper[4894]: I0613 05:51:42.287947 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:51:42 crc kubenswrapper[4894]: E0613 05:51:42.288534 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:54 crc kubenswrapper[4894]: I0613 05:51:54.277983 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:51:54 crc kubenswrapper[4894]: E0613 05:51:54.278728 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.710333 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:51:59 crc kubenswrapper[4894]: E0613 05:51:59.711280 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d49451-7ea9-4033-ae86-912025bcae4b" containerName="container-00" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.711294 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d49451-7ea9-4033-ae86-912025bcae4b" containerName="container-00" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.711512 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d49451-7ea9-4033-ae86-912025bcae4b" containerName="container-00" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.712888 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.729513 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.802236 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrlxb\" (UniqueName: \"kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.802323 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.802706 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.904761 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrlxb\" (UniqueName: \"kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.905089 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.905229 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.905565 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.905637 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:51:59 crc kubenswrapper[4894]: I0613 05:51:59.938680 4894 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nrlxb\" (UniqueName: \"kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb\") pod \"redhat-marketplace-7kdx2\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:00 crc kubenswrapper[4894]: I0613 05:52:00.065127 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:00 crc kubenswrapper[4894]: I0613 05:52:00.780449 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:52:01 crc kubenswrapper[4894]: I0613 05:52:01.405013 4894 generic.go:334] "Generic (PLEG): container finished" podID="b35b27ce-aa12-4753-9680-bf4654291081" containerID="f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45" exitCode=0 Jun 13 05:52:01 crc kubenswrapper[4894]: I0613 05:52:01.405279 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerDied","Data":"f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45"} Jun 13 05:52:01 crc kubenswrapper[4894]: I0613 05:52:01.405312 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerStarted","Data":"92a0beb0424535d847f0cd10d770e9e1d70d9abea587c0f26269f8506d33449b"} Jun 13 05:52:01 crc kubenswrapper[4894]: I0613 05:52:01.407476 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.160834 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-rjprx"] Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.162445 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.268051 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdb2n\" (UniqueName: \"kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.268174 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.370597 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdb2n\" (UniqueName: \"kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.370731 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.371101 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.396316 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdb2n\" (UniqueName: \"kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n\") pod \"crc-debug-rjprx\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " pod="openstack/crc-debug-rjprx" Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.414285 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerStarted","Data":"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f"} Jun 13 05:52:02 crc kubenswrapper[4894]: I0613 05:52:02.491650 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rjprx" Jun 13 05:52:03 crc kubenswrapper[4894]: I0613 05:52:03.427052 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-rjprx" event={"ID":"cf01611d-995f-492b-bf40-4a8ec6eb261d","Type":"ContainerStarted","Data":"b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01"} Jun 13 05:52:03 crc kubenswrapper[4894]: I0613 05:52:03.427521 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-rjprx" event={"ID":"cf01611d-995f-492b-bf40-4a8ec6eb261d","Type":"ContainerStarted","Data":"76cbd21ff925dbce6789a04e195d29d26e1aaa3a8516a2d68dd00a8c7d4e8a30"} Jun 13 05:52:03 crc kubenswrapper[4894]: I0613 05:52:03.449963 4894 generic.go:334] "Generic (PLEG): container finished" podID="b35b27ce-aa12-4753-9680-bf4654291081" containerID="d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f" exitCode=0 Jun 13 05:52:03 crc kubenswrapper[4894]: I0613 05:52:03.450018 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerDied","Data":"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f"} Jun 13 05:52:03 crc kubenswrapper[4894]: I0613 05:52:03.492730 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-rjprx" podStartSLOduration=1.492711962 podStartE2EDuration="1.492711962s" podCreationTimestamp="2025-06-13 05:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:52:03.462113037 +0000 UTC m=+3681.908360500" watchObservedRunningTime="2025-06-13 05:52:03.492711962 +0000 UTC m=+3681.938959425" Jun 13 05:52:04 crc kubenswrapper[4894]: I0613 05:52:04.464365 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerStarted","Data":"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3"} Jun 13 05:52:04 crc kubenswrapper[4894]: I0613 05:52:04.481467 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7kdx2" podStartSLOduration=2.747476154 podStartE2EDuration="5.481452688s" podCreationTimestamp="2025-06-13 05:51:59 +0000 UTC" firstStartedPulling="2025-06-13 05:52:01.407079483 +0000 UTC m=+3679.853326986" lastFinishedPulling="2025-06-13 05:52:04.141056037 +0000 UTC m=+3682.587303520" observedRunningTime="2025-06-13 05:52:04.480520922 +0000 UTC m=+3682.926768385" watchObservedRunningTime="2025-06-13 05:52:04.481452688 +0000 UTC m=+3682.927700151" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.082207 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.087135 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.132110 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.170836 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.170961 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.171015 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj4hm\" (UniqueName: \"kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.272222 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.272319 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj4hm\" (UniqueName: \"kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.272402 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.272984 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.273254 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.308772 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gj4hm\" (UniqueName: \"kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm\") pod \"certified-operators-xnm6c\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:07 crc kubenswrapper[4894]: I0613 05:52:07.426323 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:08 crc kubenswrapper[4894]: I0613 05:52:08.024529 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:08 crc kubenswrapper[4894]: I0613 05:52:08.276965 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:52:08 crc kubenswrapper[4894]: E0613 05:52:08.277391 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:52:08 crc kubenswrapper[4894]: I0613 05:52:08.502464 4894 generic.go:334] "Generic (PLEG): container finished" podID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerID="4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2" exitCode=0 Jun 13 05:52:08 crc kubenswrapper[4894]: I0613 05:52:08.502512 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerDied","Data":"4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2"} Jun 13 05:52:08 crc kubenswrapper[4894]: I0613 05:52:08.502573 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerStarted","Data":"6fb838691b69583b86aecf75484d1391f5093f21982577f03235fc8fda159dac"} Jun 13 05:52:09 crc kubenswrapper[4894]: I0613 05:52:09.513718 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerStarted","Data":"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6"} Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.066498 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.066562 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.134300 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.523301 4894 generic.go:334] "Generic (PLEG): container finished" podID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerID="256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6" exitCode=0 Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.523362 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerDied","Data":"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6"} Jun 13 05:52:10 crc kubenswrapper[4894]: I0613 05:52:10.589825 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:11 crc kubenswrapper[4894]: I0613 05:52:11.546437 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerStarted","Data":"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4"} Jun 13 05:52:12 crc kubenswrapper[4894]: I0613 05:52:12.472451 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xnm6c" podStartSLOduration=3.054962602 podStartE2EDuration="5.472423009s" podCreationTimestamp="2025-06-13 05:52:07 +0000 UTC" firstStartedPulling="2025-06-13 05:52:08.507119422 +0000 UTC m=+3686.953366925" lastFinishedPulling="2025-06-13 05:52:10.924579869 +0000 UTC m=+3689.370827332" observedRunningTime="2025-06-13 05:52:11.605027702 +0000 UTC m=+3690.051275165" watchObservedRunningTime="2025-06-13 05:52:12.472423009 +0000 UTC m=+3690.918670502" Jun 13 05:52:12 crc kubenswrapper[4894]: I0613 05:52:12.482840 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:52:12 crc kubenswrapper[4894]: I0613 05:52:12.555419 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7kdx2" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="registry-server" containerID="cri-o://304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3" gracePeriod=2 Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.064129 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-rjprx"] Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.064557 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-rjprx" podUID="cf01611d-995f-492b-bf40-4a8ec6eb261d" containerName="container-00" containerID="cri-o://b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01" gracePeriod=2 Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.075569 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-rjprx"] Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.162109 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.166929 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rjprx" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310459 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities\") pod \"b35b27ce-aa12-4753-9680-bf4654291081\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310500 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdb2n\" (UniqueName: \"kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n\") pod \"cf01611d-995f-492b-bf40-4a8ec6eb261d\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310529 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrlxb\" (UniqueName: \"kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb\") pod \"b35b27ce-aa12-4753-9680-bf4654291081\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310648 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host\") pod \"cf01611d-995f-492b-bf40-4a8ec6eb261d\" (UID: \"cf01611d-995f-492b-bf40-4a8ec6eb261d\") " Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310778 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content\") pod \"b35b27ce-aa12-4753-9680-bf4654291081\" (UID: \"b35b27ce-aa12-4753-9680-bf4654291081\") " Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.310946 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host" (OuterVolumeSpecName: "host") pod "cf01611d-995f-492b-bf40-4a8ec6eb261d" (UID: "cf01611d-995f-492b-bf40-4a8ec6eb261d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.311335 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf01611d-995f-492b-bf40-4a8ec6eb261d-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.311360 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities" (OuterVolumeSpecName: "utilities") pod "b35b27ce-aa12-4753-9680-bf4654291081" (UID: "b35b27ce-aa12-4753-9680-bf4654291081"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.317222 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b35b27ce-aa12-4753-9680-bf4654291081" (UID: "b35b27ce-aa12-4753-9680-bf4654291081"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.319123 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n" (OuterVolumeSpecName: "kube-api-access-bdb2n") pod "cf01611d-995f-492b-bf40-4a8ec6eb261d" (UID: "cf01611d-995f-492b-bf40-4a8ec6eb261d"). InnerVolumeSpecName "kube-api-access-bdb2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.319511 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb" (OuterVolumeSpecName: "kube-api-access-nrlxb") pod "b35b27ce-aa12-4753-9680-bf4654291081" (UID: "b35b27ce-aa12-4753-9680-bf4654291081"). InnerVolumeSpecName "kube-api-access-nrlxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.412823 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.412854 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35b27ce-aa12-4753-9680-bf4654291081-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.412864 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdb2n\" (UniqueName: \"kubernetes.io/projected/cf01611d-995f-492b-bf40-4a8ec6eb261d-kube-api-access-bdb2n\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.412874 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrlxb\" (UniqueName: \"kubernetes.io/projected/b35b27ce-aa12-4753-9680-bf4654291081-kube-api-access-nrlxb\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.566695 4894 generic.go:334] "Generic (PLEG): container finished" podID="cf01611d-995f-492b-bf40-4a8ec6eb261d" containerID="b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01" exitCode=0 Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.566755 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-rjprx" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.566797 4894 scope.go:117] "RemoveContainer" containerID="b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.572701 4894 generic.go:334] "Generic (PLEG): container finished" podID="b35b27ce-aa12-4753-9680-bf4654291081" containerID="304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3" exitCode=0 Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.572739 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerDied","Data":"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3"} Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.572764 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdx2" event={"ID":"b35b27ce-aa12-4753-9680-bf4654291081","Type":"ContainerDied","Data":"92a0beb0424535d847f0cd10d770e9e1d70d9abea587c0f26269f8506d33449b"} Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.572796 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdx2" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.597238 4894 scope.go:117] "RemoveContainer" containerID="b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01" Jun 13 05:52:13 crc kubenswrapper[4894]: E0613 05:52:13.598026 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01\": container with ID starting with b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01 not found: ID does not exist" containerID="b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.598096 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01"} err="failed to get container status \"b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01\": rpc error: code = NotFound desc = could not find container \"b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01\": container with ID starting with b9b9175c60c3abd713c16bdb9afb3f1a63f37a5360e32fa1853a9c8802057b01 not found: ID does not exist" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.598139 4894 scope.go:117] "RemoveContainer" containerID="304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.613356 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.625788 4894 scope.go:117] "RemoveContainer" containerID="d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.626790 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdx2"] Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.653854 4894 scope.go:117] "RemoveContainer" containerID="f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.679286 4894 scope.go:117] "RemoveContainer" 
containerID="304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3" Jun 13 05:52:13 crc kubenswrapper[4894]: E0613 05:52:13.683312 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3\": container with ID starting with 304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3 not found: ID does not exist" containerID="304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.683341 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3"} err="failed to get container status \"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3\": rpc error: code = NotFound desc = could not find container \"304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3\": container with ID starting with 304ef2f764883f8f02dc6700e6d649019f60b69e5b8c30aa1ac15eb4ef585fa3 not found: ID does not exist" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.683369 4894 scope.go:117] "RemoveContainer" containerID="d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f" Jun 13 05:52:13 crc kubenswrapper[4894]: E0613 05:52:13.684009 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f\": container with ID starting with d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f not found: ID does not exist" containerID="d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.684054 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f"} err="failed to get container status \"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f\": rpc error: code = NotFound desc = could not find container \"d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f\": container with ID starting with d21fab62120e36cf44694a1d8a9fbef0bf61a17cbeffa14515eb94f83daa411f not found: ID does not exist" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.684084 4894 scope.go:117] "RemoveContainer" containerID="f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45" Jun 13 05:52:13 crc kubenswrapper[4894]: E0613 05:52:13.684362 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45\": container with ID starting with f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45 not found: ID does not exist" containerID="f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45" Jun 13 05:52:13 crc kubenswrapper[4894]: I0613 05:52:13.684391 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45"} err="failed to get container status \"f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45\": rpc error: code = NotFound desc = could not find container \"f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45\": container with ID starting with 
f63c220a5507891b743b63661ddc9e278a0ae283fb1b0bf34304aa2235addf45 not found: ID does not exist" Jun 13 05:52:14 crc kubenswrapper[4894]: I0613 05:52:14.289066 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35b27ce-aa12-4753-9680-bf4654291081" path="/var/lib/kubelet/pods/b35b27ce-aa12-4753-9680-bf4654291081/volumes" Jun 13 05:52:14 crc kubenswrapper[4894]: I0613 05:52:14.290464 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf01611d-995f-492b-bf40-4a8ec6eb261d" path="/var/lib/kubelet/pods/cf01611d-995f-492b-bf40-4a8ec6eb261d/volumes" Jun 13 05:52:17 crc kubenswrapper[4894]: I0613 05:52:17.426923 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:17 crc kubenswrapper[4894]: I0613 05:52:17.427580 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:17 crc kubenswrapper[4894]: I0613 05:52:17.476481 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:17 crc kubenswrapper[4894]: I0613 05:52:17.711006 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:17 crc kubenswrapper[4894]: I0613 05:52:17.778755 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:19 crc kubenswrapper[4894]: I0613 05:52:19.658149 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xnm6c" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="registry-server" containerID="cri-o://7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4" gracePeriod=2 Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.210020 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.365248 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj4hm\" (UniqueName: \"kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm\") pod \"efb565ad-7312-4657-bc7b-2f86bc314d14\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.365736 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content\") pod \"efb565ad-7312-4657-bc7b-2f86bc314d14\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.365824 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities\") pod \"efb565ad-7312-4657-bc7b-2f86bc314d14\" (UID: \"efb565ad-7312-4657-bc7b-2f86bc314d14\") " Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.366815 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities" (OuterVolumeSpecName: "utilities") pod "efb565ad-7312-4657-bc7b-2f86bc314d14" (UID: "efb565ad-7312-4657-bc7b-2f86bc314d14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.373802 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm" (OuterVolumeSpecName: "kube-api-access-gj4hm") pod "efb565ad-7312-4657-bc7b-2f86bc314d14" (UID: "efb565ad-7312-4657-bc7b-2f86bc314d14"). InnerVolumeSpecName "kube-api-access-gj4hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.398778 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb565ad-7312-4657-bc7b-2f86bc314d14" (UID: "efb565ad-7312-4657-bc7b-2f86bc314d14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.467472 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.467504 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb565ad-7312-4657-bc7b-2f86bc314d14-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.467515 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj4hm\" (UniqueName: \"kubernetes.io/projected/efb565ad-7312-4657-bc7b-2f86bc314d14-kube-api-access-gj4hm\") on node \"crc\" DevicePath \"\"" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.669535 4894 generic.go:334] "Generic (PLEG): container finished" podID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerID="7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4" exitCode=0 Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.669575 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerDied","Data":"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4"} Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.669606 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnm6c" event={"ID":"efb565ad-7312-4657-bc7b-2f86bc314d14","Type":"ContainerDied","Data":"6fb838691b69583b86aecf75484d1391f5093f21982577f03235fc8fda159dac"} Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.669647 4894 scope.go:117] "RemoveContainer" containerID="7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.669740 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnm6c" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.718864 4894 scope.go:117] "RemoveContainer" containerID="256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.728023 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.738879 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xnm6c"] Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.746049 4894 scope.go:117] "RemoveContainer" containerID="4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.788312 4894 scope.go:117] "RemoveContainer" containerID="7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4" Jun 13 05:52:20 crc kubenswrapper[4894]: E0613 05:52:20.788957 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4\": container with ID starting with 7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4 not found: ID does not exist" containerID="7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.789012 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4"} err="failed to get container status \"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4\": rpc error: code = NotFound desc = could not find container \"7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4\": container with ID starting with 7eee322760ca053cb7b1e20c00f1f6199286b7977d4a1112aa6ee42b0fd633c4 not found: ID does not exist" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.789044 4894 scope.go:117] "RemoveContainer" containerID="256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6" Jun 13 05:52:20 crc kubenswrapper[4894]: E0613 05:52:20.789393 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6\": container with ID starting with 256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6 not found: ID does not exist" containerID="256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.789431 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6"} err="failed to get container status \"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6\": rpc error: code = NotFound desc = could not find container \"256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6\": container with ID starting with 256bf7547e7b41adaed01cefb042c1f18e1b5d924db6480b27f8022249e221f6 not found: ID does not exist" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.789455 4894 scope.go:117] "RemoveContainer" containerID="4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2" Jun 13 05:52:20 crc kubenswrapper[4894]: E0613 05:52:20.789817 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2\": container with ID starting with 4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2 not found: ID does not exist" containerID="4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2" Jun 13 05:52:20 crc kubenswrapper[4894]: I0613 05:52:20.789854 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2"} err="failed to get container status \"4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2\": rpc error: code = NotFound desc = could not find container \"4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2\": container with ID starting with 4dedc50f82c1999cbf65f82cc1c5f1c825f652b02170867bf90455e9a502ddf2 not found: ID does not exist" Jun 13 05:52:22 crc kubenswrapper[4894]: I0613 05:52:22.288307 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" path="/var/lib/kubelet/pods/efb565ad-7312-4657-bc7b-2f86bc314d14/volumes" Jun 13 05:52:23 crc kubenswrapper[4894]: I0613 05:52:23.277329 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:52:23 crc kubenswrapper[4894]: E0613 05:52:23.277596 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:52:35 crc kubenswrapper[4894]: I0613 05:52:35.277897 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:52:35 crc kubenswrapper[4894]: E0613 05:52:35.278963 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:52:49 crc kubenswrapper[4894]: I0613 05:52:49.277831 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:52:49 crc kubenswrapper[4894]: E0613 05:52:49.278968 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:53:00 crc kubenswrapper[4894]: I0613 05:53:00.277597 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:53:00 crc kubenswrapper[4894]: E0613 05:53:00.278304 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.543324 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-9xkr7"] Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544118 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="extract-content" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544136 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="extract-content" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544158 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf01611d-995f-492b-bf40-4a8ec6eb261d" containerName="container-00" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544167 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf01611d-995f-492b-bf40-4a8ec6eb261d" containerName="container-00" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544183 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="extract-content" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544191 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="extract-content" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544222 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544229 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544253 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="extract-utilities" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544261 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="extract-utilities" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544286 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="extract-utilities" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544294 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="extract-utilities" Jun 13 05:53:01 crc kubenswrapper[4894]: E0613 05:53:01.544317 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544325 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544545 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35b27ce-aa12-4753-9680-bf4654291081" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544560 4894 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf01611d-995f-492b-bf40-4a8ec6eb261d" containerName="container-00" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.544584 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb565ad-7312-4657-bc7b-2f86bc314d14" containerName="registry-server" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.545350 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.576601 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzd9h\" (UniqueName: \"kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.576881 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.678315 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzd9h\" (UniqueName: \"kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.678480 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.678563 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.700386 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzd9h\" (UniqueName: \"kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h\") pod \"crc-debug-9xkr7\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " pod="openstack/crc-debug-9xkr7" Jun 13 05:53:01 crc kubenswrapper[4894]: I0613 05:53:01.865537 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-9xkr7" Jun 13 05:53:02 crc kubenswrapper[4894]: I0613 05:53:02.085534 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-9xkr7" event={"ID":"ed985af6-be26-4f61-9384-17628cc6b73f","Type":"ContainerStarted","Data":"739294e311aec302b62047167fb330ca7aa8d9601f592942b8a3dd948825f5af"} Jun 13 05:53:03 crc kubenswrapper[4894]: I0613 05:53:03.095248 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-9xkr7" event={"ID":"ed985af6-be26-4f61-9384-17628cc6b73f","Type":"ContainerStarted","Data":"d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e"} Jun 13 05:53:03 crc kubenswrapper[4894]: I0613 05:53:03.122164 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-9xkr7" podStartSLOduration=2.122144335 podStartE2EDuration="2.122144335s" podCreationTimestamp="2025-06-13 05:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:53:03.116185196 +0000 UTC m=+3741.562432699" watchObservedRunningTime="2025-06-13 05:53:03.122144335 +0000 UTC m=+3741.568391798" Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.512836 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-9xkr7"] Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.513782 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-9xkr7" podUID="ed985af6-be26-4f61-9384-17628cc6b73f" containerName="container-00" containerID="cri-o://d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e" gracePeriod=2 Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.522905 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-9xkr7"] Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.612400 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-9xkr7" Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.803912 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzd9h\" (UniqueName: \"kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h\") pod \"ed985af6-be26-4f61-9384-17628cc6b73f\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.803972 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host\") pod \"ed985af6-be26-4f61-9384-17628cc6b73f\" (UID: \"ed985af6-be26-4f61-9384-17628cc6b73f\") " Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.804148 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host" (OuterVolumeSpecName: "host") pod "ed985af6-be26-4f61-9384-17628cc6b73f" (UID: "ed985af6-be26-4f61-9384-17628cc6b73f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.804491 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed985af6-be26-4f61-9384-17628cc6b73f-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.820071 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h" (OuterVolumeSpecName: "kube-api-access-bzd9h") pod "ed985af6-be26-4f61-9384-17628cc6b73f" (UID: "ed985af6-be26-4f61-9384-17628cc6b73f"). InnerVolumeSpecName "kube-api-access-bzd9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:53:12 crc kubenswrapper[4894]: I0613 05:53:12.907325 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzd9h\" (UniqueName: \"kubernetes.io/projected/ed985af6-be26-4f61-9384-17628cc6b73f-kube-api-access-bzd9h\") on node \"crc\" DevicePath \"\"" Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.206189 4894 generic.go:334] "Generic (PLEG): container finished" podID="ed985af6-be26-4f61-9384-17628cc6b73f" containerID="d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e" exitCode=0 Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.206242 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-9xkr7" Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.206277 4894 scope.go:117] "RemoveContainer" containerID="d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e" Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.238153 4894 scope.go:117] "RemoveContainer" containerID="d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e" Jun 13 05:53:13 crc kubenswrapper[4894]: E0613 05:53:13.238854 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e\": container with ID starting with d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e not found: ID does not exist" containerID="d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e" Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.238922 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e"} err="failed to get container status \"d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e\": rpc error: code = NotFound desc = could not find container \"d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e\": container with ID starting with d56005199959277d3e4a9c48a65d9562b2b1c1555701da815f9445261c49be5e not found: ID does not exist" Jun 13 05:53:13 crc kubenswrapper[4894]: I0613 05:53:13.278059 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:53:13 crc kubenswrapper[4894]: E0613 05:53:13.278857 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" 
podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:53:14 crc kubenswrapper[4894]: I0613 05:53:14.292238 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed985af6-be26-4f61-9384-17628cc6b73f" path="/var/lib/kubelet/pods/ed985af6-be26-4f61-9384-17628cc6b73f/volumes" Jun 13 05:53:28 crc kubenswrapper[4894]: I0613 05:53:28.276811 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:53:28 crc kubenswrapper[4894]: E0613 05:53:28.277648 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:53:40 crc kubenswrapper[4894]: I0613 05:53:40.277202 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:53:40 crc kubenswrapper[4894]: E0613 05:53:40.278080 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:53:55 crc kubenswrapper[4894]: I0613 05:53:55.279218 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:53:55 crc kubenswrapper[4894]: E0613 05:53:55.280237 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:54:01 crc kubenswrapper[4894]: I0613 05:54:01.989007 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-lbrfl"] Jun 13 05:54:01 crc kubenswrapper[4894]: E0613 05:54:01.990444 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed985af6-be26-4f61-9384-17628cc6b73f" containerName="container-00" Jun 13 05:54:01 crc kubenswrapper[4894]: I0613 05:54:01.990467 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed985af6-be26-4f61-9384-17628cc6b73f" containerName="container-00" Jun 13 05:54:01 crc kubenswrapper[4894]: I0613 05:54:01.990870 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed985af6-be26-4f61-9384-17628cc6b73f" containerName="container-00" Jun 13 05:54:01 crc kubenswrapper[4894]: I0613 05:54:01.991854 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.126536 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.126604 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f67b\" (UniqueName: \"kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.228856 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.228932 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f67b\" (UniqueName: \"kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.229364 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.299383 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f67b\" (UniqueName: \"kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b\") pod \"crc-debug-lbrfl\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.325482 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-lbrfl" Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.700775 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-lbrfl" event={"ID":"7cc64f95-7d6c-4dda-a66f-a09444f17a10","Type":"ContainerStarted","Data":"f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520"} Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.701258 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-lbrfl" event={"ID":"7cc64f95-7d6c-4dda-a66f-a09444f17a10","Type":"ContainerStarted","Data":"39c9619018c057e602fd1d650dcaca689cbeb6d974c27252ab3a936f4100b868"} Jun 13 05:54:02 crc kubenswrapper[4894]: I0613 05:54:02.717606 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-lbrfl" podStartSLOduration=1.7175772459999998 podStartE2EDuration="1.717577246s" podCreationTimestamp="2025-06-13 05:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:54:02.717127063 +0000 UTC m=+3801.163374536" watchObservedRunningTime="2025-06-13 05:54:02.717577246 +0000 UTC m=+3801.163824749" Jun 13 05:54:06 crc kubenswrapper[4894]: I0613 05:54:06.276945 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:54:06 crc kubenswrapper[4894]: E0613 05:54:06.277619 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:54:12 crc kubenswrapper[4894]: I0613 05:54:12.983036 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-lbrfl"] Jun 13 05:54:12 crc kubenswrapper[4894]: I0613 05:54:12.983803 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-lbrfl" podUID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" containerName="container-00" containerID="cri-o://f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520" gracePeriod=2 Jun 13 05:54:12 crc kubenswrapper[4894]: I0613 05:54:12.994134 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-lbrfl"] Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.061075 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-lbrfl" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.171559 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host\") pod \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.171731 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host" (OuterVolumeSpecName: "host") pod "7cc64f95-7d6c-4dda-a66f-a09444f17a10" (UID: "7cc64f95-7d6c-4dda-a66f-a09444f17a10"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.171825 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f67b\" (UniqueName: \"kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b\") pod \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\" (UID: \"7cc64f95-7d6c-4dda-a66f-a09444f17a10\") " Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.172505 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cc64f95-7d6c-4dda-a66f-a09444f17a10-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.179827 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b" (OuterVolumeSpecName: "kube-api-access-2f67b") pod "7cc64f95-7d6c-4dda-a66f-a09444f17a10" (UID: "7cc64f95-7d6c-4dda-a66f-a09444f17a10"). InnerVolumeSpecName "kube-api-access-2f67b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.275470 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f67b\" (UniqueName: \"kubernetes.io/projected/7cc64f95-7d6c-4dda-a66f-a09444f17a10-kube-api-access-2f67b\") on node \"crc\" DevicePath \"\"" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.815537 4894 generic.go:334] "Generic (PLEG): container finished" podID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" containerID="f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520" exitCode=0 Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.815984 4894 scope.go:117] "RemoveContainer" containerID="f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.816209 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-lbrfl" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.852103 4894 scope.go:117] "RemoveContainer" containerID="f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520" Jun 13 05:54:13 crc kubenswrapper[4894]: E0613 05:54:13.852851 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520\": container with ID starting with f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520 not found: ID does not exist" containerID="f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520" Jun 13 05:54:13 crc kubenswrapper[4894]: I0613 05:54:13.853021 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520"} err="failed to get container status \"f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520\": rpc error: code = NotFound desc = could not find container \"f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520\": container with ID starting with f7642db0dbf59b8699eb6546fdafabcc241acc2bd818fb4a7b9465ae626fe520 not found: ID does not exist" Jun 13 05:54:14 crc kubenswrapper[4894]: I0613 05:54:14.298123 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" path="/var/lib/kubelet/pods/7cc64f95-7d6c-4dda-a66f-a09444f17a10/volumes" Jun 13 05:54:18 crc kubenswrapper[4894]: I0613 05:54:18.277022 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:54:18 crc kubenswrapper[4894]: E0613 05:54:18.277925 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:54:30 crc kubenswrapper[4894]: I0613 05:54:30.276871 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:54:30 crc kubenswrapper[4894]: E0613 05:54:30.278088 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.557958 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:31 crc kubenswrapper[4894]: E0613 05:54:31.559039 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" containerName="container-00" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.559058 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" containerName="container-00" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.559395 4894 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7cc64f95-7d6c-4dda-a66f-a09444f17a10" containerName="container-00" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.563537 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.586190 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.698109 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmbq\" (UniqueName: \"kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.698207 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.698233 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.799987 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmbq\" (UniqueName: \"kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.800078 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.800103 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.800506 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.800526 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content\") pod 
\"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.819517 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmbq\" (UniqueName: \"kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq\") pod \"community-operators-xpknz\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:31 crc kubenswrapper[4894]: I0613 05:54:31.887094 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:32 crc kubenswrapper[4894]: I0613 05:54:32.325190 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:33 crc kubenswrapper[4894]: I0613 05:54:33.023904 4894 generic.go:334] "Generic (PLEG): container finished" podID="eee9817f-e202-4235-af25-c21be7388416" containerID="f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1" exitCode=0 Jun 13 05:54:33 crc kubenswrapper[4894]: I0613 05:54:33.024103 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerDied","Data":"f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1"} Jun 13 05:54:33 crc kubenswrapper[4894]: I0613 05:54:33.024343 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerStarted","Data":"36f230558ad82df431e7823e02b8b39e0f2c61ea3ac98985408c054bcbca2f01"} Jun 13 05:54:34 crc kubenswrapper[4894]: I0613 05:54:34.034614 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerStarted","Data":"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8"} Jun 13 05:54:35 crc kubenswrapper[4894]: I0613 05:54:35.046572 4894 generic.go:334] "Generic (PLEG): container finished" podID="eee9817f-e202-4235-af25-c21be7388416" containerID="0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8" exitCode=0 Jun 13 05:54:35 crc kubenswrapper[4894]: I0613 05:54:35.046609 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerDied","Data":"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8"} Jun 13 05:54:36 crc kubenswrapper[4894]: I0613 05:54:36.060062 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerStarted","Data":"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e"} Jun 13 05:54:41 crc kubenswrapper[4894]: I0613 05:54:41.888012 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:41 crc kubenswrapper[4894]: I0613 05:54:41.889380 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:41 crc kubenswrapper[4894]: I0613 05:54:41.981994 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:42 crc kubenswrapper[4894]: I0613 05:54:42.024047 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xpknz" podStartSLOduration=8.592028619 podStartE2EDuration="11.024022641s" podCreationTimestamp="2025-06-13 05:54:31 +0000 UTC" firstStartedPulling="2025-06-13 05:54:33.026526491 +0000 UTC m=+3831.472773994" lastFinishedPulling="2025-06-13 05:54:35.458520533 +0000 UTC m=+3833.904768016" observedRunningTime="2025-06-13 05:54:36.086327446 +0000 UTC m=+3834.532574919" watchObservedRunningTime="2025-06-13 05:54:42.024022641 +0000 UTC m=+3840.470270144" Jun 13 05:54:42 crc kubenswrapper[4894]: I0613 05:54:42.193477 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:42 crc kubenswrapper[4894]: I0613 05:54:42.266024 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.151825 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xpknz" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="registry-server" containerID="cri-o://487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e" gracePeriod=2 Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.279802 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:54:44 crc kubenswrapper[4894]: E0613 05:54:44.280570 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.678273 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.834239 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmbq\" (UniqueName: \"kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq\") pod \"eee9817f-e202-4235-af25-c21be7388416\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.834721 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities\") pod \"eee9817f-e202-4235-af25-c21be7388416\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.834823 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content\") pod \"eee9817f-e202-4235-af25-c21be7388416\" (UID: \"eee9817f-e202-4235-af25-c21be7388416\") " Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.836297 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities" (OuterVolumeSpecName: "utilities") pod "eee9817f-e202-4235-af25-c21be7388416" (UID: "eee9817f-e202-4235-af25-c21be7388416"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.841198 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq" (OuterVolumeSpecName: "kube-api-access-nsmbq") pod "eee9817f-e202-4235-af25-c21be7388416" (UID: "eee9817f-e202-4235-af25-c21be7388416"). InnerVolumeSpecName "kube-api-access-nsmbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.898439 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee9817f-e202-4235-af25-c21be7388416" (UID: "eee9817f-e202-4235-af25-c21be7388416"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.938213 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsmbq\" (UniqueName: \"kubernetes.io/projected/eee9817f-e202-4235-af25-c21be7388416-kube-api-access-nsmbq\") on node \"crc\" DevicePath \"\"" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.938247 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:54:44 crc kubenswrapper[4894]: I0613 05:54:44.938264 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9817f-e202-4235-af25-c21be7388416-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.166160 4894 generic.go:334] "Generic (PLEG): container finished" podID="eee9817f-e202-4235-af25-c21be7388416" containerID="487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e" exitCode=0 Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.166244 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerDied","Data":"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e"} Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.166346 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpknz" event={"ID":"eee9817f-e202-4235-af25-c21be7388416","Type":"ContainerDied","Data":"36f230558ad82df431e7823e02b8b39e0f2c61ea3ac98985408c054bcbca2f01"} Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.166408 4894 scope.go:117] "RemoveContainer" containerID="487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.166423 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpknz" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.220743 4894 scope.go:117] "RemoveContainer" containerID="0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.232994 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.241686 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xpknz"] Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.257265 4894 scope.go:117] "RemoveContainer" containerID="f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.317021 4894 scope.go:117] "RemoveContainer" containerID="487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e" Jun 13 05:54:45 crc kubenswrapper[4894]: E0613 05:54:45.317407 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e\": container with ID starting with 487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e not found: ID does not exist" containerID="487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.317460 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e"} err="failed to get container status \"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e\": rpc error: code = NotFound desc = could not find container \"487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e\": container with ID starting with 487da559cdbeaf795beb77741d92582e7cc1554d07b690f928fffee65a46ef0e not found: ID does not exist" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.317492 4894 scope.go:117] "RemoveContainer" containerID="0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8" Jun 13 05:54:45 crc kubenswrapper[4894]: E0613 05:54:45.317857 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8\": container with ID starting with 0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8 not found: ID does not exist" containerID="0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.317891 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8"} err="failed to get container status \"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8\": rpc error: code = NotFound desc = could not find container \"0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8\": container with ID starting with 0f1f5672867624a4f904e286faab0e8ae3f86e886e138f8ffa15058509f16ca8 not found: ID does not exist" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.317917 4894 scope.go:117] "RemoveContainer" containerID="f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1" Jun 13 05:54:45 crc kubenswrapper[4894]: E0613 05:54:45.318283 4894 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1\": container with ID starting with f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1 not found: ID does not exist" containerID="f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1" Jun 13 05:54:45 crc kubenswrapper[4894]: I0613 05:54:45.318344 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1"} err="failed to get container status \"f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1\": rpc error: code = NotFound desc = could not find container \"f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1\": container with ID starting with f9b339e096e65ca12d330d9e1eeff303d20e2ab51f70e8fd2b8e537c652bd8d1 not found: ID does not exist" Jun 13 05:54:46 crc kubenswrapper[4894]: I0613 05:54:46.290608 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee9817f-e202-4235-af25-c21be7388416" path="/var/lib/kubelet/pods/eee9817f-e202-4235-af25-c21be7388416/volumes" Jun 13 05:54:58 crc kubenswrapper[4894]: I0613 05:54:58.278354 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:54:59 crc kubenswrapper[4894]: I0613 05:54:59.374456 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c"} Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.394323 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-qsz6z"] Jun 13 05:55:02 crc kubenswrapper[4894]: E0613 05:55:02.395121 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="extract-content" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.395132 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="extract-content" Jun 13 05:55:02 crc kubenswrapper[4894]: E0613 05:55:02.395146 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="extract-utilities" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.395152 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="extract-utilities" Jun 13 05:55:02 crc kubenswrapper[4894]: E0613 05:55:02.395166 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="registry-server" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.395172 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="registry-server" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.395340 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee9817f-e202-4235-af25-c21be7388416" containerName="registry-server" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.395922 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.528469 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.529210 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gggp\" (UniqueName: \"kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.630927 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.631077 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.631454 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gggp\" (UniqueName: \"kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.664034 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gggp\" (UniqueName: \"kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp\") pod \"crc-debug-qsz6z\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " pod="openstack/crc-debug-qsz6z" Jun 13 05:55:02 crc kubenswrapper[4894]: I0613 05:55:02.717107 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-qsz6z" Jun 13 05:55:03 crc kubenswrapper[4894]: I0613 05:55:03.058454 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-lg282"] Jun 13 05:55:03 crc kubenswrapper[4894]: I0613 05:55:03.069534 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-lg282"] Jun 13 05:55:03 crc kubenswrapper[4894]: I0613 05:55:03.409092 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-qsz6z" event={"ID":"3a3e7d34-a502-4f65-a143-cd293f3bb8ca","Type":"ContainerStarted","Data":"b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d"} Jun 13 05:55:03 crc kubenswrapper[4894]: I0613 05:55:03.410115 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-qsz6z" event={"ID":"3a3e7d34-a502-4f65-a143-cd293f3bb8ca","Type":"ContainerStarted","Data":"b7d0a21dbbd8a598c76e8bb273c66329d8ae80c212d0d3e1d0df8342337a20f8"} Jun 13 05:55:03 crc kubenswrapper[4894]: I0613 05:55:03.427560 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-qsz6z" podStartSLOduration=1.427544049 podStartE2EDuration="1.427544049s" podCreationTimestamp="2025-06-13 05:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:55:03.426052317 +0000 UTC m=+3861.872299770" watchObservedRunningTime="2025-06-13 05:55:03.427544049 +0000 UTC m=+3861.873791522" Jun 13 05:55:04 crc kubenswrapper[4894]: I0613 05:55:04.290220 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2288fc24-1bb7-4f72-bfbf-bab43156306e" path="/var/lib/kubelet/pods/2288fc24-1bb7-4f72-bfbf-bab43156306e/volumes" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.047379 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-ba53-account-create-54cs2"] Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.061360 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-ba53-account-create-54cs2"] Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.370018 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-qsz6z"] Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.370323 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-qsz6z" podUID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" containerName="container-00" containerID="cri-o://b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d" gracePeriod=2 Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.381515 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-qsz6z"] Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.465569 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-qsz6z" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.531337 4894 generic.go:334] "Generic (PLEG): container finished" podID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" containerID="b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d" exitCode=0 Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.531410 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-qsz6z" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.531427 4894 scope.go:117] "RemoveContainer" containerID="b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.560219 4894 scope.go:117] "RemoveContainer" containerID="b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d" Jun 13 05:55:13 crc kubenswrapper[4894]: E0613 05:55:13.560632 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d\": container with ID starting with b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d not found: ID does not exist" containerID="b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.560796 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d"} err="failed to get container status \"b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d\": rpc error: code = NotFound desc = could not find container \"b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d\": container with ID starting with b21c1479a119df048fd567f9a535c187785309b32a09986c425eee3a9d67620d not found: ID does not exist" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.580037 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gggp\" (UniqueName: \"kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp\") pod \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.580111 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host\") pod \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\" (UID: \"3a3e7d34-a502-4f65-a143-cd293f3bb8ca\") " Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.580172 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host" (OuterVolumeSpecName: "host") pod "3a3e7d34-a502-4f65-a143-cd293f3bb8ca" (UID: "3a3e7d34-a502-4f65-a143-cd293f3bb8ca"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.580594 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.594758 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp" (OuterVolumeSpecName: "kube-api-access-4gggp") pod "3a3e7d34-a502-4f65-a143-cd293f3bb8ca" (UID: "3a3e7d34-a502-4f65-a143-cd293f3bb8ca"). InnerVolumeSpecName "kube-api-access-4gggp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:55:13 crc kubenswrapper[4894]: I0613 05:55:13.682492 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gggp\" (UniqueName: \"kubernetes.io/projected/3a3e7d34-a502-4f65-a143-cd293f3bb8ca-kube-api-access-4gggp\") on node \"crc\" DevicePath \"\"" Jun 13 05:55:14 crc kubenswrapper[4894]: I0613 05:55:14.295013 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" path="/var/lib/kubelet/pods/3a3e7d34-a502-4f65-a143-cd293f3bb8ca/volumes" Jun 13 05:55:14 crc kubenswrapper[4894]: I0613 05:55:14.296604 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654e1e96-bce6-4bb5-8437-d316769e5104" path="/var/lib/kubelet/pods/654e1e96-bce6-4bb5-8437-d316769e5104/volumes" Jun 13 05:55:18 crc kubenswrapper[4894]: I0613 05:55:18.453074 4894 scope.go:117] "RemoveContainer" containerID="70c8cd870760fb9644dfc2cc9acd46b78d94c5117f5172196a2c79f7aec7e5fb" Jun 13 05:55:18 crc kubenswrapper[4894]: I0613 05:55:18.490495 4894 scope.go:117] "RemoveContainer" containerID="0f3727ce100f25da2de288825900b7609beb440e4f8fa1dd6fd2d6e9cad40dd9" Jun 13 05:55:34 crc kubenswrapper[4894]: I0613 05:55:34.055931 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-mqtjp"] Jun 13 05:55:34 crc kubenswrapper[4894]: I0613 05:55:34.071342 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-mqtjp"] Jun 13 05:55:34 crc kubenswrapper[4894]: I0613 05:55:34.295998 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472fb412-9ee0-4bdb-b7e0-ec470d468b4b" path="/var/lib/kubelet/pods/472fb412-9ee0-4bdb-b7e0-ec470d468b4b/volumes" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.802515 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-hj5gm"] Jun 13 05:56:01 crc kubenswrapper[4894]: E0613 05:56:01.803755 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" containerName="container-00" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.803777 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" containerName="container-00" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.804113 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e7d34-a502-4f65-a143-cd293f3bb8ca" containerName="container-00" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.805095 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-hj5gm" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.961278 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:01 crc kubenswrapper[4894]: I0613 05:56:01.963650 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjx55\" (UniqueName: \"kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: I0613 05:56:02.066473 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjx55\" (UniqueName: \"kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: I0613 05:56:02.066693 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: I0613 05:56:02.067005 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: I0613 05:56:02.092754 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjx55\" (UniqueName: \"kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55\") pod \"crc-debug-hj5gm\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: I0613 05:56:02.137932 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-hj5gm" Jun 13 05:56:02 crc kubenswrapper[4894]: W0613 05:56:02.178085 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod685d5347_8370_40a6_b8a6_49f15317ed30.slice/crio-ab1ba75486a9cc00146016362bd8300c4bc9631b9163c34a174a7c9121be0a29 WatchSource:0}: Error finding container ab1ba75486a9cc00146016362bd8300c4bc9631b9163c34a174a7c9121be0a29: Status 404 returned error can't find the container with id ab1ba75486a9cc00146016362bd8300c4bc9631b9163c34a174a7c9121be0a29 Jun 13 05:56:03 crc kubenswrapper[4894]: I0613 05:56:03.081889 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-hj5gm" event={"ID":"685d5347-8370-40a6-b8a6-49f15317ed30","Type":"ContainerStarted","Data":"efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10"} Jun 13 05:56:03 crc kubenswrapper[4894]: I0613 05:56:03.083525 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-hj5gm" event={"ID":"685d5347-8370-40a6-b8a6-49f15317ed30","Type":"ContainerStarted","Data":"ab1ba75486a9cc00146016362bd8300c4bc9631b9163c34a174a7c9121be0a29"} Jun 13 05:56:03 crc kubenswrapper[4894]: I0613 05:56:03.113350 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-hj5gm" podStartSLOduration=2.113325093 podStartE2EDuration="2.113325093s" podCreationTimestamp="2025-06-13 05:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:56:03.103796634 +0000 UTC m=+3921.550044117" watchObservedRunningTime="2025-06-13 05:56:03.113325093 +0000 UTC m=+3921.559572586" Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.713118 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-hj5gm"] Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.713987 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-hj5gm" podUID="685d5347-8370-40a6-b8a6-49f15317ed30" containerName="container-00" containerID="cri-o://efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10" gracePeriod=2 Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.733388 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-hj5gm"] Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.812172 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-hj5gm" Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.925556 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjx55\" (UniqueName: \"kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55\") pod \"685d5347-8370-40a6-b8a6-49f15317ed30\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.925938 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host\") pod \"685d5347-8370-40a6-b8a6-49f15317ed30\" (UID: \"685d5347-8370-40a6-b8a6-49f15317ed30\") " Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.926061 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host" (OuterVolumeSpecName: "host") pod "685d5347-8370-40a6-b8a6-49f15317ed30" (UID: "685d5347-8370-40a6-b8a6-49f15317ed30"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.926951 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/685d5347-8370-40a6-b8a6-49f15317ed30-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:56:12 crc kubenswrapper[4894]: I0613 05:56:12.934346 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55" (OuterVolumeSpecName: "kube-api-access-bjx55") pod "685d5347-8370-40a6-b8a6-49f15317ed30" (UID: "685d5347-8370-40a6-b8a6-49f15317ed30"). InnerVolumeSpecName "kube-api-access-bjx55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.028253 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjx55\" (UniqueName: \"kubernetes.io/projected/685d5347-8370-40a6-b8a6-49f15317ed30-kube-api-access-bjx55\") on node \"crc\" DevicePath \"\"" Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.195860 4894 generic.go:334] "Generic (PLEG): container finished" podID="685d5347-8370-40a6-b8a6-49f15317ed30" containerID="efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10" exitCode=0 Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.195906 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-hj5gm" Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.195943 4894 scope.go:117] "RemoveContainer" containerID="efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10" Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.222588 4894 scope.go:117] "RemoveContainer" containerID="efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10" Jun 13 05:56:13 crc kubenswrapper[4894]: E0613 05:56:13.223173 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10\": container with ID starting with efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10 not found: ID does not exist" containerID="efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10" Jun 13 05:56:13 crc kubenswrapper[4894]: I0613 05:56:13.223208 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10"} err="failed to get container status \"efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10\": rpc error: code = NotFound desc = could not find container \"efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10\": container with ID starting with efc90e7d37b0922d117e0c434c7882285a11671ca4a2be3ab7ba0919bcdbde10 not found: ID does not exist" Jun 13 05:56:14 crc kubenswrapper[4894]: I0613 05:56:14.295133 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685d5347-8370-40a6-b8a6-49f15317ed30" path="/var/lib/kubelet/pods/685d5347-8370-40a6-b8a6-49f15317ed30/volumes" Jun 13 05:56:18 crc kubenswrapper[4894]: I0613 05:56:18.666404 4894 scope.go:117] "RemoveContainer" containerID="b6de53990a689386be7d5e33b2267a83e96aac4ef9647b46de766f7a7663f3e4" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.085030 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-nqgcz"] Jun 13 05:57:02 crc kubenswrapper[4894]: E0613 05:57:02.086253 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685d5347-8370-40a6-b8a6-49f15317ed30" containerName="container-00" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.086275 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="685d5347-8370-40a6-b8a6-49f15317ed30" containerName="container-00" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.086484 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="685d5347-8370-40a6-b8a6-49f15317ed30" containerName="container-00" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.087268 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.158737 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.158826 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drss\" (UniqueName: \"kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.259906 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.260001 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8drss\" (UniqueName: \"kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.260391 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.282120 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drss\" (UniqueName: \"kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss\") pod \"crc-debug-nqgcz\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.408095 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-nqgcz" Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.803393 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-nqgcz" event={"ID":"4c9f5786-4cfd-4752-98c6-aca92894a99d","Type":"ContainerStarted","Data":"088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d"} Jun 13 05:57:02 crc kubenswrapper[4894]: I0613 05:57:02.803733 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-nqgcz" event={"ID":"4c9f5786-4cfd-4752-98c6-aca92894a99d","Type":"ContainerStarted","Data":"6a4ef7043f566dc22a8e7ae31c53c69b132769e4023d757577769bc0779c106b"} Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.090166 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-nqgcz" podStartSLOduration=11.090145329 podStartE2EDuration="11.090145329s" podCreationTimestamp="2025-06-13 05:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:57:02.824195238 +0000 UTC m=+3981.270442711" watchObservedRunningTime="2025-06-13 05:57:13.090145329 +0000 UTC m=+3991.536392802" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.093095 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-nqgcz"] Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.093348 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-nqgcz" podUID="4c9f5786-4cfd-4752-98c6-aca92894a99d" containerName="container-00" containerID="cri-o://088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d" gracePeriod=2 Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.103769 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-nqgcz"] Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.178052 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nqgcz" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.199844 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8drss\" (UniqueName: \"kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss\") pod \"4c9f5786-4cfd-4752-98c6-aca92894a99d\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.200981 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host\") pod \"4c9f5786-4cfd-4752-98c6-aca92894a99d\" (UID: \"4c9f5786-4cfd-4752-98c6-aca92894a99d\") " Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.201041 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host" (OuterVolumeSpecName: "host") pod "4c9f5786-4cfd-4752-98c6-aca92894a99d" (UID: "4c9f5786-4cfd-4752-98c6-aca92894a99d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.201856 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c9f5786-4cfd-4752-98c6-aca92894a99d-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.208368 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss" (OuterVolumeSpecName: "kube-api-access-8drss") pod "4c9f5786-4cfd-4752-98c6-aca92894a99d" (UID: "4c9f5786-4cfd-4752-98c6-aca92894a99d"). InnerVolumeSpecName "kube-api-access-8drss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.303609 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8drss\" (UniqueName: \"kubernetes.io/projected/4c9f5786-4cfd-4752-98c6-aca92894a99d-kube-api-access-8drss\") on node \"crc\" DevicePath \"\"" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.921248 4894 generic.go:334] "Generic (PLEG): container finished" podID="4c9f5786-4cfd-4752-98c6-aca92894a99d" containerID="088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d" exitCode=0 Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.921339 4894 scope.go:117] "RemoveContainer" containerID="088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.921346 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-nqgcz" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.956622 4894 scope.go:117] "RemoveContainer" containerID="088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d" Jun 13 05:57:13 crc kubenswrapper[4894]: E0613 05:57:13.957204 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d\": container with ID starting with 088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d not found: ID does not exist" containerID="088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d" Jun 13 05:57:13 crc kubenswrapper[4894]: I0613 05:57:13.957228 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d"} err="failed to get container status \"088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d\": rpc error: code = NotFound desc = could not find container \"088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d\": container with ID starting with 088384f30abda2dcb7fafea5db054e54cade474ed191e45012d1d8fc5dbe374d not found: ID does not exist" Jun 13 05:57:14 crc kubenswrapper[4894]: I0613 05:57:14.294488 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9f5786-4cfd-4752-98c6-aca92894a99d" path="/var/lib/kubelet/pods/4c9f5786-4cfd-4752-98c6-aca92894a99d/volumes" Jun 13 05:57:26 crc kubenswrapper[4894]: I0613 05:57:26.236727 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:57:26 crc kubenswrapper[4894]: I0613 
05:57:26.237303 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:57:56 crc kubenswrapper[4894]: I0613 05:57:56.236496 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:57:56 crc kubenswrapper[4894]: I0613 05:57:56.237021 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.540093 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-mp2sz"] Jun 13 05:58:01 crc kubenswrapper[4894]: E0613 05:58:01.540883 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9f5786-4cfd-4752-98c6-aca92894a99d" containerName="container-00" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.540896 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9f5786-4cfd-4752-98c6-aca92894a99d" containerName="container-00" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.541091 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9f5786-4cfd-4752-98c6-aca92894a99d" containerName="container-00" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.541630 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.633143 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzz8j\" (UniqueName: \"kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.633470 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.735531 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.735691 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.735709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzz8j\" (UniqueName: \"kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.762935 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzz8j\" (UniqueName: \"kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j\") pod \"crc-debug-mp2sz\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " pod="openstack/crc-debug-mp2sz" Jun 13 05:58:01 crc kubenswrapper[4894]: I0613 05:58:01.872517 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mp2sz" Jun 13 05:58:02 crc kubenswrapper[4894]: I0613 05:58:02.378960 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mp2sz" event={"ID":"149be97a-ec32-4f68-884c-1147224ad870","Type":"ContainerStarted","Data":"2cc0177cd087d1860336a079320e818fd16f6cc2526e66ea9a00e1394397240c"} Jun 13 05:58:02 crc kubenswrapper[4894]: I0613 05:58:02.379325 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mp2sz" event={"ID":"149be97a-ec32-4f68-884c-1147224ad870","Type":"ContainerStarted","Data":"4087962cd0380e2b4372b8c231ede13eb68c151e902ccdbd2fb75ad6362c1ce6"} Jun 13 05:58:02 crc kubenswrapper[4894]: I0613 05:58:02.405498 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-mp2sz" podStartSLOduration=1.405473668 podStartE2EDuration="1.405473668s" podCreationTimestamp="2025-06-13 05:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:58:02.397947685 +0000 UTC m=+4040.844195188" watchObservedRunningTime="2025-06-13 05:58:02.405473668 +0000 UTC m=+4040.851721171" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.421736 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-mp2sz"] Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.422500 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-mp2sz" podUID="149be97a-ec32-4f68-884c-1147224ad870" containerName="container-00" containerID="cri-o://2cc0177cd087d1860336a079320e818fd16f6cc2526e66ea9a00e1394397240c" gracePeriod=2 Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.440231 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-mp2sz"] Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.466819 4894 generic.go:334] "Generic (PLEG): container finished" podID="149be97a-ec32-4f68-884c-1147224ad870" containerID="2cc0177cd087d1860336a079320e818fd16f6cc2526e66ea9a00e1394397240c" exitCode=0 Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.467049 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4087962cd0380e2b4372b8c231ede13eb68c151e902ccdbd2fb75ad6362c1ce6" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.506420 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mp2sz" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.599873 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzz8j\" (UniqueName: \"kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j\") pod \"149be97a-ec32-4f68-884c-1147224ad870\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.599994 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host\") pod \"149be97a-ec32-4f68-884c-1147224ad870\" (UID: \"149be97a-ec32-4f68-884c-1147224ad870\") " Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.600149 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host" (OuterVolumeSpecName: "host") pod "149be97a-ec32-4f68-884c-1147224ad870" (UID: "149be97a-ec32-4f68-884c-1147224ad870"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.600490 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/149be97a-ec32-4f68-884c-1147224ad870-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.606338 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j" (OuterVolumeSpecName: "kube-api-access-bzz8j") pod "149be97a-ec32-4f68-884c-1147224ad870" (UID: "149be97a-ec32-4f68-884c-1147224ad870"). InnerVolumeSpecName "kube-api-access-bzz8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:58:12 crc kubenswrapper[4894]: I0613 05:58:12.702226 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzz8j\" (UniqueName: \"kubernetes.io/projected/149be97a-ec32-4f68-884c-1147224ad870-kube-api-access-bzz8j\") on node \"crc\" DevicePath \"\"" Jun 13 05:58:13 crc kubenswrapper[4894]: I0613 05:58:13.474733 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mp2sz" Jun 13 05:58:14 crc kubenswrapper[4894]: I0613 05:58:14.286754 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149be97a-ec32-4f68-884c-1147224ad870" path="/var/lib/kubelet/pods/149be97a-ec32-4f68-884c-1147224ad870/volumes" Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.236475 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.237006 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.237055 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.237718 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.237785 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c" gracePeriod=600 Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.606425 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c" exitCode=0 Jun 13 
05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.606740 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c"} Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.606791 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0"} Jun 13 05:58:26 crc kubenswrapper[4894]: I0613 05:58:26.606810 4894 scope.go:117] "RemoveContainer" containerID="4b3e07070f8ccd40ee6fa435995f8cebb8929047b0b70ec3b940e78054e80a8d" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.802084 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-4crsh"] Jun 13 05:59:01 crc kubenswrapper[4894]: E0613 05:59:01.803310 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149be97a-ec32-4f68-884c-1147224ad870" containerName="container-00" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.803334 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="149be97a-ec32-4f68-884c-1147224ad870" containerName="container-00" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.803682 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="149be97a-ec32-4f68-884c-1147224ad870" containerName="container-00" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.804750 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4crsh" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.854931 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " pod="openstack/crc-debug-4crsh" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.855012 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4lc\" (UniqueName: \"kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " pod="openstack/crc-debug-4crsh" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.957246 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4lc\" (UniqueName: \"kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " pod="openstack/crc-debug-4crsh" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.957593 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " pod="openstack/crc-debug-4crsh" Jun 13 05:59:01 crc kubenswrapper[4894]: I0613 05:59:01.957795 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " 
pod="openstack/crc-debug-4crsh" Jun 13 05:59:02 crc kubenswrapper[4894]: I0613 05:59:02.003885 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4lc\" (UniqueName: \"kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc\") pod \"crc-debug-4crsh\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " pod="openstack/crc-debug-4crsh" Jun 13 05:59:02 crc kubenswrapper[4894]: I0613 05:59:02.134413 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-4crsh" Jun 13 05:59:03 crc kubenswrapper[4894]: I0613 05:59:03.005382 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4crsh" event={"ID":"c6bf634c-39e6-4f31-b878-900492da1da4","Type":"ContainerStarted","Data":"2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5"} Jun 13 05:59:03 crc kubenswrapper[4894]: I0613 05:59:03.006023 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4crsh" event={"ID":"c6bf634c-39e6-4f31-b878-900492da1da4","Type":"ContainerStarted","Data":"f80a4fd411ba6aaefd64ea36fe0de6c509765aaf162738f7e71b0f4d479196c4"} Jun 13 05:59:03 crc kubenswrapper[4894]: I0613 05:59:03.040188 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-4crsh" podStartSLOduration=2.040171514 podStartE2EDuration="2.040171514s" podCreationTimestamp="2025-06-13 05:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 05:59:03.027842406 +0000 UTC m=+4101.474089909" watchObservedRunningTime="2025-06-13 05:59:03.040171514 +0000 UTC m=+4101.486418977" Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.679730 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-4crsh"] Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.680647 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-4crsh" podUID="c6bf634c-39e6-4f31-b878-900492da1da4" containerName="container-00" containerID="cri-o://2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5" gracePeriod=2 Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.695994 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-4crsh"] Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.776458 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4crsh" Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.804349 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host\") pod \"c6bf634c-39e6-4f31-b878-900492da1da4\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.804407 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4lc\" (UniqueName: \"kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc\") pod \"c6bf634c-39e6-4f31-b878-900492da1da4\" (UID: \"c6bf634c-39e6-4f31-b878-900492da1da4\") " Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.804476 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host" (OuterVolumeSpecName: "host") pod "c6bf634c-39e6-4f31-b878-900492da1da4" (UID: "c6bf634c-39e6-4f31-b878-900492da1da4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.805066 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6bf634c-39e6-4f31-b878-900492da1da4-host\") on node \"crc\" DevicePath \"\"" Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.815813 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc" (OuterVolumeSpecName: "kube-api-access-dw4lc") pod "c6bf634c-39e6-4f31-b878-900492da1da4" (UID: "c6bf634c-39e6-4f31-b878-900492da1da4"). InnerVolumeSpecName "kube-api-access-dw4lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:59:12 crc kubenswrapper[4894]: I0613 05:59:12.907111 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw4lc\" (UniqueName: \"kubernetes.io/projected/c6bf634c-39e6-4f31-b878-900492da1da4-kube-api-access-dw4lc\") on node \"crc\" DevicePath \"\"" Jun 13 05:59:13 crc kubenswrapper[4894]: I0613 05:59:13.105692 4894 generic.go:334] "Generic (PLEG): container finished" podID="c6bf634c-39e6-4f31-b878-900492da1da4" containerID="2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5" exitCode=0 Jun 13 05:59:13 crc kubenswrapper[4894]: I0613 05:59:13.105739 4894 scope.go:117] "RemoveContainer" containerID="2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5" Jun 13 05:59:13 crc kubenswrapper[4894]: I0613 05:59:13.105833 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4crsh" Jun 13 05:59:13 crc kubenswrapper[4894]: I0613 05:59:13.151572 4894 scope.go:117] "RemoveContainer" containerID="2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5" Jun 13 05:59:13 crc kubenswrapper[4894]: E0613 05:59:13.152191 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5\": container with ID starting with 2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5 not found: ID does not exist" containerID="2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5" Jun 13 05:59:13 crc kubenswrapper[4894]: I0613 05:59:13.152243 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5"} err="failed to get container status \"2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5\": rpc error: code = NotFound desc = could not find container \"2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5\": container with ID starting with 2271853103a53b49af5cdb4acc5065e5b1ee17acd6f7983938bec59236a7d9b5 not found: ID does not exist" Jun 13 05:59:14 crc kubenswrapper[4894]: I0613 05:59:14.296264 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bf634c-39e6-4f31-b878-900492da1da4" path="/var/lib/kubelet/pods/c6bf634c-39e6-4f31-b878-900492da1da4/volumes" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.511017 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:19 crc kubenswrapper[4894]: E0613 05:59:19.512140 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bf634c-39e6-4f31-b878-900492da1da4" containerName="container-00" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.512157 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bf634c-39e6-4f31-b878-900492da1da4" containerName="container-00" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.512384 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bf634c-39e6-4f31-b878-900492da1da4" containerName="container-00" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.513747 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.536829 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.549096 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.549260 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgz72\" (UniqueName: \"kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.549674 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.650894 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.650955 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.651016 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgz72\" (UniqueName: \"kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.651434 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.651475 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.673605 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tgz72\" (UniqueName: \"kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72\") pod \"redhat-operators-rmd6v\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:19 crc kubenswrapper[4894]: I0613 05:59:19.843977 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:20 crc kubenswrapper[4894]: I0613 05:59:20.214337 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:21 crc kubenswrapper[4894]: I0613 05:59:21.222186 4894 generic.go:334] "Generic (PLEG): container finished" podID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerID="2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d" exitCode=0 Jun 13 05:59:21 crc kubenswrapper[4894]: I0613 05:59:21.222284 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerDied","Data":"2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d"} Jun 13 05:59:21 crc kubenswrapper[4894]: I0613 05:59:21.222803 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerStarted","Data":"7f783235c73e3bc53cd4e3588d9c2f395f34d4d16fe7200e560e7347d85444c2"} Jun 13 05:59:21 crc kubenswrapper[4894]: I0613 05:59:21.227773 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 05:59:22 crc kubenswrapper[4894]: I0613 05:59:22.239291 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerStarted","Data":"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a"} Jun 13 05:59:25 crc kubenswrapper[4894]: I0613 05:59:25.269644 4894 generic.go:334] "Generic (PLEG): container finished" podID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerID="9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a" exitCode=0 Jun 13 05:59:25 crc kubenswrapper[4894]: I0613 05:59:25.269710 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerDied","Data":"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a"} Jun 13 05:59:26 crc kubenswrapper[4894]: I0613 05:59:26.296081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerStarted","Data":"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec"} Jun 13 05:59:26 crc kubenswrapper[4894]: I0613 05:59:26.325979 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rmd6v" podStartSLOduration=2.829970609 podStartE2EDuration="7.325960366s" podCreationTimestamp="2025-06-13 05:59:19 +0000 UTC" firstStartedPulling="2025-06-13 05:59:21.227216218 +0000 UTC m=+4119.673463691" lastFinishedPulling="2025-06-13 05:59:25.723205975 +0000 UTC m=+4124.169453448" observedRunningTime="2025-06-13 05:59:26.31936399 +0000 UTC m=+4124.765611463" watchObservedRunningTime="2025-06-13 05:59:26.325960366 +0000 UTC m=+4124.772207839" Jun 13 05:59:29 crc 
kubenswrapper[4894]: I0613 05:59:29.844443 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:29 crc kubenswrapper[4894]: I0613 05:59:29.845166 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:30 crc kubenswrapper[4894]: I0613 05:59:30.903232 4894 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rmd6v" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="registry-server" probeResult="failure" output=< Jun 13 05:59:30 crc kubenswrapper[4894]: timeout: failed to connect service ":50051" within 1s Jun 13 05:59:30 crc kubenswrapper[4894]: > Jun 13 05:59:39 crc kubenswrapper[4894]: I0613 05:59:39.906002 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:39 crc kubenswrapper[4894]: I0613 05:59:39.987439 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:40 crc kubenswrapper[4894]: I0613 05:59:40.152381 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:41 crc kubenswrapper[4894]: I0613 05:59:41.434945 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rmd6v" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="registry-server" containerID="cri-o://1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec" gracePeriod=2 Jun 13 05:59:41 crc kubenswrapper[4894]: I0613 05:59:41.967483 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.026357 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities\") pod \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.026700 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgz72\" (UniqueName: \"kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72\") pod \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.026815 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content\") pod \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\" (UID: \"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a\") " Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.027203 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities" (OuterVolumeSpecName: "utilities") pod "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" (UID: "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.027368 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.033884 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72" (OuterVolumeSpecName: "kube-api-access-tgz72") pod "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" (UID: "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a"). InnerVolumeSpecName "kube-api-access-tgz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.101107 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" (UID: "19e2d7f0-6d14-457b-9f3a-eff04efc4e2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.128094 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgz72\" (UniqueName: \"kubernetes.io/projected/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-kube-api-access-tgz72\") on node \"crc\" DevicePath \"\"" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.128121 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.449841 4894 generic.go:334] "Generic (PLEG): container finished" podID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerID="1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec" exitCode=0 Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.449911 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerDied","Data":"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec"} Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.450020 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rmd6v" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.450090 4894 scope.go:117] "RemoveContainer" containerID="1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.450059 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmd6v" event={"ID":"19e2d7f0-6d14-457b-9f3a-eff04efc4e2a","Type":"ContainerDied","Data":"7f783235c73e3bc53cd4e3588d9c2f395f34d4d16fe7200e560e7347d85444c2"} Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.486649 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.500785 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rmd6v"] Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.511191 4894 scope.go:117] "RemoveContainer" containerID="9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.564020 4894 scope.go:117] "RemoveContainer" containerID="2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.590129 4894 scope.go:117] "RemoveContainer" containerID="1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec" Jun 13 05:59:42 crc kubenswrapper[4894]: E0613 05:59:42.590729 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec\": container with ID starting with 1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec not found: ID does not exist" containerID="1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.590835 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec"} err="failed to get container status \"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec\": rpc error: code = NotFound desc = could not find container \"1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec\": container with ID starting with 1c20ec0facb879794d06962f1a570ff2e2d98de8174d69c7520204b481db3cec not found: ID does not exist" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.590931 4894 scope.go:117] "RemoveContainer" containerID="9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a" Jun 13 05:59:42 crc kubenswrapper[4894]: E0613 05:59:42.591557 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a\": container with ID starting with 9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a not found: ID does not exist" containerID="9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.591598 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a"} err="failed to get container status \"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a\": rpc error: code = NotFound desc = could not find container 
\"9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a\": container with ID starting with 9264bfe7e6ca9b0f082e3223ae331a25904cb518c6f7cf2553c7ae6dc8e0f37a not found: ID does not exist" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.591628 4894 scope.go:117] "RemoveContainer" containerID="2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d" Jun 13 05:59:42 crc kubenswrapper[4894]: E0613 05:59:42.591948 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d\": container with ID starting with 2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d not found: ID does not exist" containerID="2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d" Jun 13 05:59:42 crc kubenswrapper[4894]: I0613 05:59:42.591981 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d"} err="failed to get container status \"2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d\": rpc error: code = NotFound desc = could not find container \"2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d\": container with ID starting with 2cae1c864df9201ecc9eb9653bbe5c1a9cdba03de68fa837ffcf1f4113c1963d not found: ID does not exist" Jun 13 05:59:44 crc kubenswrapper[4894]: I0613 05:59:44.296559 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" path="/var/lib/kubelet/pods/19e2d7f0-6d14-457b-9f3a-eff04efc4e2a/volumes" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.169035 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv"] Jun 13 06:00:00 crc kubenswrapper[4894]: E0613 06:00:00.170356 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="extract-content" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.170382 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="extract-content" Jun 13 06:00:00 crc kubenswrapper[4894]: E0613 06:00:00.170419 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="registry-server" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.170432 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="registry-server" Jun 13 06:00:00 crc kubenswrapper[4894]: E0613 06:00:00.170455 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="extract-utilities" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.170471 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="extract-utilities" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.170839 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e2d7f0-6d14-457b-9f3a-eff04efc4e2a" containerName="registry-server" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.171926 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.178981 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.179182 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.199957 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv"] Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.310426 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnz6t\" (UniqueName: \"kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.310830 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.311022 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.412539 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.412693 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnz6t\" (UniqueName: \"kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.412723 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.413466 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume\") pod 
\"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.429966 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.445482 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnz6t\" (UniqueName: \"kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t\") pod \"collect-profiles-29163240-q9mrv\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.511729 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:00 crc kubenswrapper[4894]: I0613 06:00:00.996027 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv"] Jun 13 06:00:01 crc kubenswrapper[4894]: I0613 06:00:01.666504 4894 generic.go:334] "Generic (PLEG): container finished" podID="7b225e62-b5fb-46c9-8744-f321f84ab343" containerID="3d6eba304039eee564c76b5b50b09f94016d8af61baffae56d18b94f896b4c18" exitCode=0 Jun 13 06:00:01 crc kubenswrapper[4894]: I0613 06:00:01.666929 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" event={"ID":"7b225e62-b5fb-46c9-8744-f321f84ab343","Type":"ContainerDied","Data":"3d6eba304039eee564c76b5b50b09f94016d8af61baffae56d18b94f896b4c18"} Jun 13 06:00:01 crc kubenswrapper[4894]: I0613 06:00:01.667004 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" event={"ID":"7b225e62-b5fb-46c9-8744-f321f84ab343","Type":"ContainerStarted","Data":"82f056e21d34b51a547692dac76938c23a6fc48472e6db1df26de10049171cbd"} Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.071604 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-2vgg8"] Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.077116 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.251373 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.251446 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rpc\" (UniqueName: \"kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.352985 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rpc\" (UniqueName: \"kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.353208 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.353331 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.384967 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rpc\" (UniqueName: \"kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc\") pod \"crc-debug-2vgg8\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.407363 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-2vgg8" Jun 13 06:00:02 crc kubenswrapper[4894]: W0613 06:00:02.466917 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd6e1350_7c07_4112_aa4d_7feb84591084.slice/crio-88644eee5362620fe227aa63e2b1e69587b444cee71b1ab74192a11e934a6f59 WatchSource:0}: Error finding container 88644eee5362620fe227aa63e2b1e69587b444cee71b1ab74192a11e934a6f59: Status 404 returned error can't find the container with id 88644eee5362620fe227aa63e2b1e69587b444cee71b1ab74192a11e934a6f59 Jun 13 06:00:02 crc kubenswrapper[4894]: I0613 06:00:02.675800 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-2vgg8" event={"ID":"cd6e1350-7c07-4112-aa4d-7feb84591084","Type":"ContainerStarted","Data":"88644eee5362620fe227aa63e2b1e69587b444cee71b1ab74192a11e934a6f59"} Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.020929 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.170917 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume\") pod \"7b225e62-b5fb-46c9-8744-f321f84ab343\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.171093 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnz6t\" (UniqueName: \"kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t\") pod \"7b225e62-b5fb-46c9-8744-f321f84ab343\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.171262 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume\") pod \"7b225e62-b5fb-46c9-8744-f321f84ab343\" (UID: \"7b225e62-b5fb-46c9-8744-f321f84ab343\") " Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.172710 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b225e62-b5fb-46c9-8744-f321f84ab343" (UID: "7b225e62-b5fb-46c9-8744-f321f84ab343"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.192282 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b225e62-b5fb-46c9-8744-f321f84ab343" (UID: "7b225e62-b5fb-46c9-8744-f321f84ab343"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.192435 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t" (OuterVolumeSpecName: "kube-api-access-lnz6t") pod "7b225e62-b5fb-46c9-8744-f321f84ab343" (UID: "7b225e62-b5fb-46c9-8744-f321f84ab343"). InnerVolumeSpecName "kube-api-access-lnz6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.273138 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnz6t\" (UniqueName: \"kubernetes.io/projected/7b225e62-b5fb-46c9-8744-f321f84ab343-kube-api-access-lnz6t\") on node \"crc\" DevicePath \"\"" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.273170 4894 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b225e62-b5fb-46c9-8744-f321f84ab343-config-volume\") on node \"crc\" DevicePath \"\"" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.273180 4894 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b225e62-b5fb-46c9-8744-f321f84ab343-secret-volume\") on node \"crc\" DevicePath \"\"" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.686916 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" event={"ID":"7b225e62-b5fb-46c9-8744-f321f84ab343","Type":"ContainerDied","Data":"82f056e21d34b51a547692dac76938c23a6fc48472e6db1df26de10049171cbd"} Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.687197 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82f056e21d34b51a547692dac76938c23a6fc48472e6db1df26de10049171cbd" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.687251 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29163240-q9mrv" Jun 13 06:00:03 crc kubenswrapper[4894]: I0613 06:00:03.691582 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-2vgg8" event={"ID":"cd6e1350-7c07-4112-aa4d-7feb84591084","Type":"ContainerStarted","Data":"ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c"} Jun 13 06:00:04 crc kubenswrapper[4894]: I0613 06:00:04.073490 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-2vgg8" podStartSLOduration=2.073463355 podStartE2EDuration="2.073463355s" podCreationTimestamp="2025-06-13 06:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:00:03.71305023 +0000 UTC m=+4162.159297693" watchObservedRunningTime="2025-06-13 06:00:04.073463355 +0000 UTC m=+4162.519710858" Jun 13 06:00:04 crc kubenswrapper[4894]: I0613 06:00:04.136487 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m"] Jun 13 06:00:04 crc kubenswrapper[4894]: I0613 06:00:04.145855 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29163195-zgm7m"] Jun 13 06:00:04 crc kubenswrapper[4894]: I0613 06:00:04.293430 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b28a50d-2d30-4225-aa88-54fbdaf4a48a" path="/var/lib/kubelet/pods/0b28a50d-2d30-4225-aa88-54fbdaf4a48a/volumes" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.044108 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-2vgg8"] Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.045542 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-2vgg8" podUID="cd6e1350-7c07-4112-aa4d-7feb84591084" containerName="container-00" 
containerID="cri-o://ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c" gracePeriod=2 Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.051410 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-2vgg8"] Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.183305 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-2vgg8" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.329108 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host\") pod \"cd6e1350-7c07-4112-aa4d-7feb84591084\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.329225 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host" (OuterVolumeSpecName: "host") pod "cd6e1350-7c07-4112-aa4d-7feb84591084" (UID: "cd6e1350-7c07-4112-aa4d-7feb84591084"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.329516 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8rpc\" (UniqueName: \"kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc\") pod \"cd6e1350-7c07-4112-aa4d-7feb84591084\" (UID: \"cd6e1350-7c07-4112-aa4d-7feb84591084\") " Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.330402 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd6e1350-7c07-4112-aa4d-7feb84591084-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.336959 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc" (OuterVolumeSpecName: "kube-api-access-b8rpc") pod "cd6e1350-7c07-4112-aa4d-7feb84591084" (UID: "cd6e1350-7c07-4112-aa4d-7feb84591084"). InnerVolumeSpecName "kube-api-access-b8rpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.431769 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8rpc\" (UniqueName: \"kubernetes.io/projected/cd6e1350-7c07-4112-aa4d-7feb84591084-kube-api-access-b8rpc\") on node \"crc\" DevicePath \"\"" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.811233 4894 generic.go:334] "Generic (PLEG): container finished" podID="cd6e1350-7c07-4112-aa4d-7feb84591084" containerID="ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c" exitCode=0 Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.811306 4894 scope.go:117] "RemoveContainer" containerID="ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.811458 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-2vgg8" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.841151 4894 scope.go:117] "RemoveContainer" containerID="ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c" Jun 13 06:00:13 crc kubenswrapper[4894]: E0613 06:00:13.841605 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c\": container with ID starting with ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c not found: ID does not exist" containerID="ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c" Jun 13 06:00:13 crc kubenswrapper[4894]: I0613 06:00:13.841690 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c"} err="failed to get container status \"ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c\": rpc error: code = NotFound desc = could not find container \"ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c\": container with ID starting with ef8e1d502159914d6f3458eb7e8463ba274d81d67ca88399a0395abd17846a1c not found: ID does not exist" Jun 13 06:00:14 crc kubenswrapper[4894]: I0613 06:00:14.293273 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6e1350-7c07-4112-aa4d-7feb84591084" path="/var/lib/kubelet/pods/cd6e1350-7c07-4112-aa4d-7feb84591084/volumes" Jun 13 06:00:18 crc kubenswrapper[4894]: I0613 06:00:18.935404 4894 scope.go:117] "RemoveContainer" containerID="ce660fb0cb608d8b8ded9837fa75c9c01491abf5532441da05fce07c7b0fdb89" Jun 13 06:00:26 crc kubenswrapper[4894]: I0613 06:00:26.235916 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 06:00:26 crc kubenswrapper[4894]: I0613 06:00:26.237395 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 06:00:56 crc kubenswrapper[4894]: I0613 06:00:56.235984 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 06:00:56 crc kubenswrapper[4894]: I0613 06:00:56.236475 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.191955 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29163241-m92x7"] Jun 13 06:01:00 crc kubenswrapper[4894]: E0613 06:01:00.195382 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6e1350-7c07-4112-aa4d-7feb84591084" 
containerName="container-00" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.195443 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6e1350-7c07-4112-aa4d-7feb84591084" containerName="container-00" Jun 13 06:01:00 crc kubenswrapper[4894]: E0613 06:01:00.195488 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b225e62-b5fb-46c9-8744-f321f84ab343" containerName="collect-profiles" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.195500 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b225e62-b5fb-46c9-8744-f321f84ab343" containerName="collect-profiles" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.195865 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd6e1350-7c07-4112-aa4d-7feb84591084" containerName="container-00" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.195887 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b225e62-b5fb-46c9-8744-f321f84ab343" containerName="collect-profiles" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.197001 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.213155 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29163241-m92x7"] Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.359582 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ht8b\" (UniqueName: \"kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.359633 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.359721 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.360082 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.463109 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.465238 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.467270 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ht8b\" (UniqueName: \"kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.467388 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.469400 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.477608 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.477606 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.493103 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ht8b\" (UniqueName: \"kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b\") pod \"keystone-cron-29163241-m92x7\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.532806 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:00 crc kubenswrapper[4894]: I0613 06:01:00.991785 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29163241-m92x7"] Jun 13 06:01:01 crc kubenswrapper[4894]: W0613 06:01:01.006870 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55ff65a8_628e_4770_b12b_f2d50722acd2.slice/crio-edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce WatchSource:0}: Error finding container edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce: Status 404 returned error can't find the container with id edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.315354 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29163241-m92x7" event={"ID":"55ff65a8-628e-4770-b12b-f2d50722acd2","Type":"ContainerStarted","Data":"ab1cafea58e64e2fc0ca556bcba5bef9affddd86cbdac9223df57f7c8e648a06"} Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.315417 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29163241-m92x7" event={"ID":"55ff65a8-628e-4770-b12b-f2d50722acd2","Type":"ContainerStarted","Data":"edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce"} Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.340214 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29163241-m92x7" podStartSLOduration=1.3401919119999999 podStartE2EDuration="1.340191912s" podCreationTimestamp="2025-06-13 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:01:01.339583255 +0000 UTC m=+4219.785830758" watchObservedRunningTime="2025-06-13 06:01:01.340191912 +0000 UTC m=+4219.786439385" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.502581 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-ggcpr"] Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.505395 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.702914 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zsz\" (UniqueName: \"kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.704536 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.806857 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.806947 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zsz\" (UniqueName: \"kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.811865 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:01 crc kubenswrapper[4894]: I0613 06:01:01.853430 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zsz\" (UniqueName: \"kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz\") pod \"crc-debug-ggcpr\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " pod="openstack/crc-debug-ggcpr" Jun 13 06:01:02 crc kubenswrapper[4894]: I0613 06:01:02.153297 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-ggcpr" Jun 13 06:01:02 crc kubenswrapper[4894]: W0613 06:01:02.192442 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51b81c4e_2e8a_4d34_a365_f188eed1ea4b.slice/crio-cf3afeaafdc37dbc944a5845b43d0dcf9322bd9875c2126a8820c5365aeb7483 WatchSource:0}: Error finding container cf3afeaafdc37dbc944a5845b43d0dcf9322bd9875c2126a8820c5365aeb7483: Status 404 returned error can't find the container with id cf3afeaafdc37dbc944a5845b43d0dcf9322bd9875c2126a8820c5365aeb7483 Jun 13 06:01:02 crc kubenswrapper[4894]: I0613 06:01:02.326713 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-ggcpr" event={"ID":"51b81c4e-2e8a-4d34-a365-f188eed1ea4b","Type":"ContainerStarted","Data":"cf3afeaafdc37dbc944a5845b43d0dcf9322bd9875c2126a8820c5365aeb7483"} Jun 13 06:01:03 crc kubenswrapper[4894]: I0613 06:01:03.339075 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-ggcpr" event={"ID":"51b81c4e-2e8a-4d34-a365-f188eed1ea4b","Type":"ContainerStarted","Data":"05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907"} Jun 13 06:01:03 crc kubenswrapper[4894]: I0613 06:01:03.360332 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-ggcpr" podStartSLOduration=2.360315506 podStartE2EDuration="2.360315506s" podCreationTimestamp="2025-06-13 06:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:01:03.352132525 +0000 UTC m=+4221.798380018" watchObservedRunningTime="2025-06-13 06:01:03.360315506 +0000 UTC m=+4221.806562969" Jun 13 06:01:04 crc kubenswrapper[4894]: I0613 06:01:04.353589 4894 generic.go:334] "Generic (PLEG): container finished" podID="55ff65a8-628e-4770-b12b-f2d50722acd2" containerID="ab1cafea58e64e2fc0ca556bcba5bef9affddd86cbdac9223df57f7c8e648a06" exitCode=0 Jun 13 06:01:04 crc kubenswrapper[4894]: I0613 06:01:04.353639 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29163241-m92x7" event={"ID":"55ff65a8-628e-4770-b12b-f2d50722acd2","Type":"ContainerDied","Data":"ab1cafea58e64e2fc0ca556bcba5bef9affddd86cbdac9223df57f7c8e648a06"} Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.729558 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.800957 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys\") pod \"55ff65a8-628e-4770-b12b-f2d50722acd2\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.801425 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data\") pod \"55ff65a8-628e-4770-b12b-f2d50722acd2\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.801673 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle\") pod \"55ff65a8-628e-4770-b12b-f2d50722acd2\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.801964 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ht8b\" (UniqueName: \"kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b\") pod \"55ff65a8-628e-4770-b12b-f2d50722acd2\" (UID: \"55ff65a8-628e-4770-b12b-f2d50722acd2\") " Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.807750 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b" (OuterVolumeSpecName: "kube-api-access-8ht8b") pod "55ff65a8-628e-4770-b12b-f2d50722acd2" (UID: "55ff65a8-628e-4770-b12b-f2d50722acd2"). InnerVolumeSpecName "kube-api-access-8ht8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.809589 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55ff65a8-628e-4770-b12b-f2d50722acd2" (UID: "55ff65a8-628e-4770-b12b-f2d50722acd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.838616 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55ff65a8-628e-4770-b12b-f2d50722acd2" (UID: "55ff65a8-628e-4770-b12b-f2d50722acd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.889036 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data" (OuterVolumeSpecName: "config-data") pod "55ff65a8-628e-4770-b12b-f2d50722acd2" (UID: "55ff65a8-628e-4770-b12b-f2d50722acd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.904441 4894 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.904473 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.904486 4894 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ff65a8-628e-4770-b12b-f2d50722acd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:05 crc kubenswrapper[4894]: I0613 06:01:05.904503 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ht8b\" (UniqueName: \"kubernetes.io/projected/55ff65a8-628e-4770-b12b-f2d50722acd2-kube-api-access-8ht8b\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:06 crc kubenswrapper[4894]: I0613 06:01:06.378140 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29163241-m92x7" event={"ID":"55ff65a8-628e-4770-b12b-f2d50722acd2","Type":"ContainerDied","Data":"edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce"} Jun 13 06:01:06 crc kubenswrapper[4894]: I0613 06:01:06.378206 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc60c5d7ce1594c13c2698ca78c35e2e5491fe7cef46e4d414b6ed07e5058ce" Jun 13 06:01:06 crc kubenswrapper[4894]: I0613 06:01:06.378269 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29163241-m92x7" Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.743037 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-ggcpr"] Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.743974 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-ggcpr" podUID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" containerName="container-00" containerID="cri-o://05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907" gracePeriod=2 Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.751355 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-ggcpr"] Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.826357 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-ggcpr" Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.913862 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4zsz\" (UniqueName: \"kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz\") pod \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.914155 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host\") pod \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\" (UID: \"51b81c4e-2e8a-4d34-a365-f188eed1ea4b\") " Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.914972 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host" (OuterVolumeSpecName: "host") pod "51b81c4e-2e8a-4d34-a365-f188eed1ea4b" (UID: "51b81c4e-2e8a-4d34-a365-f188eed1ea4b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:01:12 crc kubenswrapper[4894]: I0613 06:01:12.932935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz" (OuterVolumeSpecName: "kube-api-access-r4zsz") pod "51b81c4e-2e8a-4d34-a365-f188eed1ea4b" (UID: "51b81c4e-2e8a-4d34-a365-f188eed1ea4b"). InnerVolumeSpecName "kube-api-access-r4zsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.021178 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.021247 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4zsz\" (UniqueName: \"kubernetes.io/projected/51b81c4e-2e8a-4d34-a365-f188eed1ea4b-kube-api-access-r4zsz\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.473034 4894 generic.go:334] "Generic (PLEG): container finished" podID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" containerID="05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907" exitCode=0 Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.473108 4894 scope.go:117] "RemoveContainer" containerID="05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.473136 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-ggcpr" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.510189 4894 scope.go:117] "RemoveContainer" containerID="05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907" Jun 13 06:01:13 crc kubenswrapper[4894]: E0613 06:01:13.511269 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907\": container with ID starting with 05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907 not found: ID does not exist" containerID="05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907" Jun 13 06:01:13 crc kubenswrapper[4894]: I0613 06:01:13.511320 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907"} err="failed to get container status \"05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907\": rpc error: code = NotFound desc = could not find container \"05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907\": container with ID starting with 05bbd800b7aa6ce757506b2a27f2335027a33d8ff1d165974840f88a32b37907 not found: ID does not exist" Jun 13 06:01:14 crc kubenswrapper[4894]: I0613 06:01:14.294897 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" path="/var/lib/kubelet/pods/51b81c4e-2e8a-4d34-a365-f188eed1ea4b/volumes" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.236336 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.236867 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.236918 4894 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.237642 4894 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0"} pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.237728 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" containerID="cri-o://65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" gracePeriod=600 Jun 13 06:01:26 crc kubenswrapper[4894]: E0613 06:01:26.377396 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.634764 4894 generic.go:334] "Generic (PLEG): container finished" podID="192fcf92-25d2-4664-bb9d-8857929dd084" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" exitCode=0 Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.634846 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerDied","Data":"65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0"} Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.635152 4894 scope.go:117] "RemoveContainer" containerID="a01acfc542ad9f1f6a03b71cff196608ac754a0f3b7057651496a2cf876a6f6c" Jun 13 06:01:26 crc kubenswrapper[4894]: I0613 06:01:26.635883 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:01:26 crc kubenswrapper[4894]: E0613 06:01:26.636196 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:01:27 crc kubenswrapper[4894]: I0613 06:01:27.646060 4894 generic.go:334] "Generic (PLEG): container finished" podID="357d4a2c-de1e-47c2-8602-9b717b898330" containerID="45a1fa75515fe9f6eaa03e92778e99917ba1c3855c5d5252e05050adf1b9d6ba" exitCode=0 Jun 13 06:01:27 crc kubenswrapper[4894]: I0613 06:01:27.646190 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"357d4a2c-de1e-47c2-8602-9b717b898330","Type":"ContainerDied","Data":"45a1fa75515fe9f6eaa03e92778e99917ba1c3855c5d5252e05050adf1b9d6ba"} Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.091921 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.183141 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.183227 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184005 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data" (OuterVolumeSpecName: "config-data") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184298 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184351 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xv5k\" (UniqueName: \"kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184388 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184423 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184498 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184569 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.184594 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary\") pod \"357d4a2c-de1e-47c2-8602-9b717b898330\" (UID: \"357d4a2c-de1e-47c2-8602-9b717b898330\") " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.185025 4894 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-config-data\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.185388 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.188325 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.189838 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k" (OuterVolumeSpecName: "kube-api-access-8xv5k") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "kube-api-access-8xv5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.191348 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.215424 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.220344 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.221886 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.235120 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "357d4a2c-de1e-47c2-8602-9b717b898330" (UID: "357d4a2c-de1e-47c2-8602-9b717b898330"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.286545 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xv5k\" (UniqueName: \"kubernetes.io/projected/357d4a2c-de1e-47c2-8602-9b717b898330-kube-api-access-8xv5k\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.286579 4894 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ca-certs\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.286589 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.286600 4894 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.287277 4894 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.287298 4894 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/357d4a2c-de1e-47c2-8602-9b717b898330-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.287310 4894 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.287320 4894 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/357d4a2c-de1e-47c2-8602-9b717b898330-ssh-key\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.316080 4894 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.390506 4894 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.668365 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"357d4a2c-de1e-47c2-8602-9b717b898330","Type":"ContainerDied","Data":"6c6c7261988d3a873779229864ef15356847a4e40cd30bb16691c1eea13e4969"} Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.668907 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c6c7261988d3a873779229864ef15356847a4e40cd30bb16691c1eea13e4969" Jun 13 06:01:29 crc kubenswrapper[4894]: I0613 06:01:29.668482 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.255528 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jun 13 06:01:32 crc kubenswrapper[4894]: E0613 06:01:32.256575 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ff65a8-628e-4770-b12b-f2d50722acd2" containerName="keystone-cron" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256591 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ff65a8-628e-4770-b12b-f2d50722acd2" containerName="keystone-cron" Jun 13 06:01:32 crc kubenswrapper[4894]: E0613 06:01:32.256609 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357d4a2c-de1e-47c2-8602-9b717b898330" containerName="tempest-tests-tempest-tests-runner" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256617 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="357d4a2c-de1e-47c2-8602-9b717b898330" containerName="tempest-tests-tempest-tests-runner" Jun 13 06:01:32 crc kubenswrapper[4894]: E0613 06:01:32.256630 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" containerName="container-00" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256637 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" containerName="container-00" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256844 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="357d4a2c-de1e-47c2-8602-9b717b898330" containerName="tempest-tests-tempest-tests-runner" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256882 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b81c4e-2e8a-4d34-a365-f188eed1ea4b" containerName="container-00" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.256894 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ff65a8-628e-4770-b12b-f2d50722acd2" containerName="keystone-cron" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.257542 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.259235 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gm2wn" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.269487 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.351840 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.351987 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r76zt\" (UniqueName: \"kubernetes.io/projected/9a0f3544-2f43-428d-ab9a-04afce67765a-kube-api-access-r76zt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.453647 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.453925 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r76zt\" (UniqueName: \"kubernetes.io/projected/9a0f3544-2f43-428d-ab9a-04afce67765a-kube-api-access-r76zt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:32 crc kubenswrapper[4894]: I0613 06:01:32.454074 4894 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:33 crc kubenswrapper[4894]: I0613 06:01:33.105466 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r76zt\" (UniqueName: \"kubernetes.io/projected/9a0f3544-2f43-428d-ab9a-04afce67765a-kube-api-access-r76zt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:33 crc kubenswrapper[4894]: I0613 06:01:33.264757 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a0f3544-2f43-428d-ab9a-04afce67765a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:33 crc 
kubenswrapper[4894]: I0613 06:01:33.479728 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jun 13 06:01:34 crc kubenswrapper[4894]: I0613 06:01:34.180460 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jun 13 06:01:34 crc kubenswrapper[4894]: W0613 06:01:34.184235 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a0f3544_2f43_428d_ab9a_04afce67765a.slice/crio-322147035f779437237a8ab04e1bcc5a1ead2455464fcc583e95943ed5c9c3f7 WatchSource:0}: Error finding container 322147035f779437237a8ab04e1bcc5a1ead2455464fcc583e95943ed5c9c3f7: Status 404 returned error can't find the container with id 322147035f779437237a8ab04e1bcc5a1ead2455464fcc583e95943ed5c9c3f7 Jun 13 06:01:34 crc kubenswrapper[4894]: I0613 06:01:34.716844 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a0f3544-2f43-428d-ab9a-04afce67765a","Type":"ContainerStarted","Data":"322147035f779437237a8ab04e1bcc5a1ead2455464fcc583e95943ed5c9c3f7"} Jun 13 06:01:35 crc kubenswrapper[4894]: I0613 06:01:35.728173 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a0f3544-2f43-428d-ab9a-04afce67765a","Type":"ContainerStarted","Data":"a8a027fca323b8a22804a17edaabe17dfdbfc0eaa73d4960c5607cdda0dfb9a0"} Jun 13 06:01:35 crc kubenswrapper[4894]: I0613 06:01:35.742590 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.612544314 podStartE2EDuration="3.742572825s" podCreationTimestamp="2025-06-13 06:01:32 +0000 UTC" firstStartedPulling="2025-06-13 06:01:34.189044227 +0000 UTC m=+4252.635291690" lastFinishedPulling="2025-06-13 06:01:35.319072748 +0000 UTC m=+4253.765320201" observedRunningTime="2025-06-13 06:01:35.741562497 +0000 UTC m=+4254.187810000" watchObservedRunningTime="2025-06-13 06:01:35.742572825 +0000 UTC m=+4254.188820288" Jun 13 06:01:40 crc kubenswrapper[4894]: I0613 06:01:40.277477 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:01:40 crc kubenswrapper[4894]: E0613 06:01:40.278560 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:01:52 crc kubenswrapper[4894]: I0613 06:01:52.284084 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:01:52 crc kubenswrapper[4894]: E0613 06:01:52.285253 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" 
podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.160665 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-6zqhq"] Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.165814 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.196508 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4g67\" (UniqueName: \"kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.196919 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.298361 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4g67\" (UniqueName: \"kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.298490 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.298650 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.318954 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4g67\" (UniqueName: \"kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67\") pod \"crc-debug-6zqhq\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " pod="openstack/crc-debug-6zqhq" Jun 13 06:02:02 crc kubenswrapper[4894]: I0613 06:02:02.509622 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-6zqhq" Jun 13 06:02:03 crc kubenswrapper[4894]: I0613 06:02:03.030511 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-6zqhq" event={"ID":"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9","Type":"ContainerStarted","Data":"1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e"} Jun 13 06:02:03 crc kubenswrapper[4894]: I0613 06:02:03.030927 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-6zqhq" event={"ID":"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9","Type":"ContainerStarted","Data":"7412e4fa7c5f3c96eb3dcbbff8a605749bec3d0efd23c3d1b71388d673d87dcc"} Jun 13 06:02:03 crc kubenswrapper[4894]: I0613 06:02:03.059594 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-6zqhq" podStartSLOduration=1.059568642 podStartE2EDuration="1.059568642s" podCreationTimestamp="2025-06-13 06:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:02:03.053000686 +0000 UTC m=+4281.499248139" watchObservedRunningTime="2025-06-13 06:02:03.059568642 +0000 UTC m=+4281.505816145" Jun 13 06:02:06 crc kubenswrapper[4894]: I0613 06:02:06.281893 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:02:06 crc kubenswrapper[4894]: E0613 06:02:06.283201 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.301258 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6sz/must-gather-8x9kr"] Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.302788 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.305685 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vp6sz"/"openshift-service-ca.crt" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.306850 4894 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vp6sz"/"default-dockercfg-cwbpc" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.310333 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6sz/must-gather-8x9kr"] Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.316858 4894 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vp6sz"/"kube-root-ca.crt" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.477290 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.477355 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mb4d\" (UniqueName: \"kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.578640 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.578762 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mb4d\" (UniqueName: \"kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.579717 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.602438 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mb4d\" (UniqueName: \"kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d\") pod \"must-gather-8x9kr\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:07 crc kubenswrapper[4894]: I0613 06:02:07.621054 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:02:08 crc kubenswrapper[4894]: I0613 06:02:08.129611 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6sz/must-gather-8x9kr"] Jun 13 06:02:09 crc kubenswrapper[4894]: I0613 06:02:09.099224 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" event={"ID":"4a5c4b72-b727-4db0-b145-e14f5f9f3087","Type":"ContainerStarted","Data":"376bb9efa19e4cc75af6da13dfef0253eaad5a8ef7dc1b170b030abf7be9c78e"} Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.754188 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-6zqhq"] Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.754921 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-6zqhq" podUID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" containerName="container-00" containerID="cri-o://1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e" gracePeriod=2 Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.766960 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-6zqhq"] Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.929867 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-6zqhq" Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.954766 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4g67\" (UniqueName: \"kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67\") pod \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.955003 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host\") pod \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\" (UID: \"4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9\") " Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.955074 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host" (OuterVolumeSpecName: "host") pod "4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" (UID: "4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.955740 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:02:14 crc kubenswrapper[4894]: I0613 06:02:14.961232 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67" (OuterVolumeSpecName: "kube-api-access-v4g67") pod "4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" (UID: "4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9"). InnerVolumeSpecName "kube-api-access-v4g67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.057726 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4g67\" (UniqueName: \"kubernetes.io/projected/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9-kube-api-access-v4g67\") on node \"crc\" DevicePath \"\"" Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.185604 4894 generic.go:334] "Generic (PLEG): container finished" podID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" containerID="1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e" exitCode=0 Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.185680 4894 scope.go:117] "RemoveContainer" containerID="1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e" Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.185784 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-6zqhq" Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.195425 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" event={"ID":"4a5c4b72-b727-4db0-b145-e14f5f9f3087","Type":"ContainerStarted","Data":"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b"} Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.225547 4894 scope.go:117] "RemoveContainer" containerID="1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e" Jun 13 06:02:15 crc kubenswrapper[4894]: E0613 06:02:15.226055 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e\": container with ID starting with 1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e not found: ID does not exist" containerID="1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e" Jun 13 06:02:15 crc kubenswrapper[4894]: I0613 06:02:15.226117 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e"} err="failed to get container status \"1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e\": rpc error: code = NotFound desc = could not find container \"1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e\": container with ID starting with 1d035f31440f10c5626f88f65a4fb7678a7bb2e75ff3577a184eee90b9194b9e not found: ID does not exist" Jun 13 06:02:16 crc kubenswrapper[4894]: I0613 06:02:16.210576 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" event={"ID":"4a5c4b72-b727-4db0-b145-e14f5f9f3087","Type":"ContainerStarted","Data":"f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc"} Jun 13 06:02:16 crc kubenswrapper[4894]: I0613 06:02:16.231492 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" podStartSLOduration=2.70937269 podStartE2EDuration="9.23147594s" podCreationTimestamp="2025-06-13 06:02:07 +0000 UTC" firstStartedPulling="2025-06-13 06:02:08.115245583 +0000 UTC m=+4286.561493046" lastFinishedPulling="2025-06-13 06:02:14.637348833 +0000 UTC m=+4293.083596296" observedRunningTime="2025-06-13 06:02:16.227272511 +0000 UTC m=+4294.673519984" watchObservedRunningTime="2025-06-13 06:02:16.23147594 +0000 UTC m=+4294.677723413" Jun 13 06:02:16 crc kubenswrapper[4894]: I0613 06:02:16.290234 4894 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" path="/var/lib/kubelet/pods/4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9/volumes" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.545862 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-rr8p9"] Jun 13 06:02:19 crc kubenswrapper[4894]: E0613 06:02:19.546734 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" containerName="container-00" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.546753 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" containerName="container-00" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.546984 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e12b9bf-10cf-4dc7-ab4d-3ff42d9f8ac9" containerName="container-00" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.547780 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.656170 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.656478 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989t8\" (UniqueName: \"kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.758093 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.758171 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989t8\" (UniqueName: \"kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.758229 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.781249 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-989t8\" (UniqueName: \"kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8\") pod \"crc-debug-rr8p9\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:19 crc kubenswrapper[4894]: I0613 06:02:19.863360 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:02:21 crc kubenswrapper[4894]: I0613 06:02:21.258038 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" event={"ID":"f3a3bd19-df18-4ddd-b547-eb5c5b552d33","Type":"ContainerStarted","Data":"b52d70a9957a518618e5da9f40f5f09559257cf24307cfe76632b31edd4bc4bd"} Jun 13 06:02:21 crc kubenswrapper[4894]: I0613 06:02:21.258523 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" event={"ID":"f3a3bd19-df18-4ddd-b547-eb5c5b552d33","Type":"ContainerStarted","Data":"a186419e2c98ad1108ef5059ec25e546536066f22f433df614e3d1c1e2bf2075"} Jun 13 06:02:21 crc kubenswrapper[4894]: I0613 06:02:21.273915 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" podStartSLOduration=2.2738849070000002 podStartE2EDuration="2.273884907s" podCreationTimestamp="2025-06-13 06:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:02:21.26690058 +0000 UTC m=+4299.713148043" watchObservedRunningTime="2025-06-13 06:02:21.273884907 +0000 UTC m=+4299.720132370" Jun 13 06:02:21 crc kubenswrapper[4894]: I0613 06:02:21.276882 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:02:21 crc kubenswrapper[4894]: E0613 06:02:21.277113 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:02:36 crc kubenswrapper[4894]: I0613 06:02:36.278083 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:02:36 crc kubenswrapper[4894]: E0613 06:02:36.278741 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:02:50 crc kubenswrapper[4894]: I0613 06:02:50.276309 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:02:50 crc kubenswrapper[4894]: E0613 06:02:50.278191 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.201113 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-4ftm6"] Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.203807 4894 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.279019 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rd9\" (UniqueName: \"kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.279664 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.381855 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rd9\" (UniqueName: \"kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.381897 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.381992 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.409193 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rd9\" (UniqueName: \"kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9\") pod \"crc-debug-4ftm6\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.543040 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4ftm6" Jun 13 06:03:02 crc kubenswrapper[4894]: I0613 06:03:02.645229 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4ftm6" event={"ID":"ff278757-7ac8-4930-a3dd-7b8e8b9a9321","Type":"ContainerStarted","Data":"6fd3d9022d1f8c28ff7e7eb18f3ee657557e1b481bef499bc9d2f130d05ce0a0"} Jun 13 06:03:03 crc kubenswrapper[4894]: I0613 06:03:03.667024 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-4ftm6" event={"ID":"ff278757-7ac8-4930-a3dd-7b8e8b9a9321","Type":"ContainerStarted","Data":"fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f"} Jun 13 06:03:03 crc kubenswrapper[4894]: I0613 06:03:03.694074 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-4ftm6" podStartSLOduration=1.6940591440000001 podStartE2EDuration="1.694059144s" podCreationTimestamp="2025-06-13 06:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:03:03.686412118 +0000 UTC m=+4342.132659581" watchObservedRunningTime="2025-06-13 06:03:03.694059144 +0000 UTC m=+4342.140306607" Jun 13 06:03:04 crc kubenswrapper[4894]: I0613 06:03:04.277340 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:03:04 crc kubenswrapper[4894]: E0613 06:03:04.278113 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.503843 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-4ftm6"] Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.504698 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-4ftm6" podUID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" containerName="container-00" containerID="cri-o://fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f" gracePeriod=2 Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.510395 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-4ftm6"] Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.571644 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4ftm6" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.632111 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86rd9\" (UniqueName: \"kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9\") pod \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.632336 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host\") pod \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\" (UID: \"ff278757-7ac8-4930-a3dd-7b8e8b9a9321\") " Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.632468 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host" (OuterVolumeSpecName: "host") pod "ff278757-7ac8-4930-a3dd-7b8e8b9a9321" (UID: "ff278757-7ac8-4930-a3dd-7b8e8b9a9321"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.632826 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.642828 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9" (OuterVolumeSpecName: "kube-api-access-86rd9") pod "ff278757-7ac8-4930-a3dd-7b8e8b9a9321" (UID: "ff278757-7ac8-4930-a3dd-7b8e8b9a9321"). InnerVolumeSpecName "kube-api-access-86rd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.734527 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86rd9\" (UniqueName: \"kubernetes.io/projected/ff278757-7ac8-4930-a3dd-7b8e8b9a9321-kube-api-access-86rd9\") on node \"crc\" DevicePath \"\"" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.764256 4894 generic.go:334] "Generic (PLEG): container finished" podID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" containerID="fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f" exitCode=0 Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.764314 4894 scope.go:117] "RemoveContainer" containerID="fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.764324 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-4ftm6" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.786813 4894 scope.go:117] "RemoveContainer" containerID="fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f" Jun 13 06:03:13 crc kubenswrapper[4894]: E0613 06:03:13.787348 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f\": container with ID starting with fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f not found: ID does not exist" containerID="fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f" Jun 13 06:03:13 crc kubenswrapper[4894]: I0613 06:03:13.787401 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f"} err="failed to get container status \"fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f\": rpc error: code = NotFound desc = could not find container \"fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f\": container with ID starting with fa74f807368aca3108754c646778d159814d675f0fc0836a565d496967c8ae6f not found: ID does not exist" Jun 13 06:03:13 crc kubenswrapper[4894]: E0613 06:03:13.876756 4894 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff278757_7ac8_4930_a3dd_7b8e8b9a9321.slice/crio-6fd3d9022d1f8c28ff7e7eb18f3ee657557e1b481bef499bc9d2f130d05ce0a0\": RecentStats: unable to find data in memory cache]" Jun 13 06:03:14 crc kubenswrapper[4894]: I0613 06:03:14.290433 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" path="/var/lib/kubelet/pods/ff278757-7ac8-4930-a3dd-7b8e8b9a9321/volumes" Jun 13 06:03:15 crc kubenswrapper[4894]: I0613 06:03:15.277016 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:03:15 crc kubenswrapper[4894]: E0613 06:03:15.277705 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.032363 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:26 crc kubenswrapper[4894]: E0613 06:03:26.033272 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" containerName="container-00" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.033288 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" containerName="container-00" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.033459 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff278757-7ac8-4930-a3dd-7b8e8b9a9321" containerName="container-00" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.034692 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.045391 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.166357 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.166461 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.166501 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvznz\" (UniqueName: \"kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.268079 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.268308 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.268387 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvznz\" (UniqueName: \"kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.269538 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.269739 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.277364 4894 scope.go:117] "RemoveContainer" 
containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:03:26 crc kubenswrapper[4894]: E0613 06:03:26.277696 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.292306 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvznz\" (UniqueName: \"kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz\") pod \"redhat-marketplace-28mv4\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.364908 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.833782 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:26 crc kubenswrapper[4894]: I0613 06:03:26.864896 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerStarted","Data":"709dfa8ecefd3ffdb3b44c6105e0e21e80e478ce56ab4f0ef7192a76298af0e4"} Jun 13 06:03:27 crc kubenswrapper[4894]: I0613 06:03:27.879939 4894 generic.go:334] "Generic (PLEG): container finished" podID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerID="83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4" exitCode=0 Jun 13 06:03:27 crc kubenswrapper[4894]: I0613 06:03:27.880029 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerDied","Data":"83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4"} Jun 13 06:03:28 crc kubenswrapper[4894]: I0613 06:03:28.896018 4894 generic.go:334] "Generic (PLEG): container finished" podID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerID="9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9" exitCode=0 Jun 13 06:03:28 crc kubenswrapper[4894]: I0613 06:03:28.896081 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerDied","Data":"9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9"} Jun 13 06:03:30 crc kubenswrapper[4894]: I0613 06:03:30.930980 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerStarted","Data":"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95"} Jun 13 06:03:30 crc kubenswrapper[4894]: I0613 06:03:30.966164 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28mv4" podStartSLOduration=3.498343633 podStartE2EDuration="4.96614542s" podCreationTimestamp="2025-06-13 06:03:26 +0000 UTC" firstStartedPulling="2025-06-13 06:03:27.883684477 +0000 UTC m=+4366.329931940" lastFinishedPulling="2025-06-13 
06:03:29.351486264 +0000 UTC m=+4367.797733727" observedRunningTime="2025-06-13 06:03:30.959779481 +0000 UTC m=+4369.406026974" watchObservedRunningTime="2025-06-13 06:03:30.96614542 +0000 UTC m=+4369.412392883" Jun 13 06:03:36 crc kubenswrapper[4894]: I0613 06:03:36.365748 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:36 crc kubenswrapper[4894]: I0613 06:03:36.368962 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:36 crc kubenswrapper[4894]: I0613 06:03:36.428335 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:37 crc kubenswrapper[4894]: I0613 06:03:37.064527 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:37 crc kubenswrapper[4894]: I0613 06:03:37.119743 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.008266 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28mv4" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="registry-server" containerID="cri-o://95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95" gracePeriod=2 Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.276684 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:03:39 crc kubenswrapper[4894]: E0613 06:03:39.276946 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.409777 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.553266 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content\") pod \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.553749 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvznz\" (UniqueName: \"kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz\") pod \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.553854 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities\") pod \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\" (UID: \"d7ab5faf-81cf-474f-9ec9-f6111029b2e9\") " Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.554547 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities" (OuterVolumeSpecName: "utilities") pod "d7ab5faf-81cf-474f-9ec9-f6111029b2e9" (UID: "d7ab5faf-81cf-474f-9ec9-f6111029b2e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.565184 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7ab5faf-81cf-474f-9ec9-f6111029b2e9" (UID: "d7ab5faf-81cf-474f-9ec9-f6111029b2e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.566955 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz" (OuterVolumeSpecName: "kube-api-access-pvznz") pod "d7ab5faf-81cf-474f-9ec9-f6111029b2e9" (UID: "d7ab5faf-81cf-474f-9ec9-f6111029b2e9"). InnerVolumeSpecName "kube-api-access-pvznz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.656759 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvznz\" (UniqueName: \"kubernetes.io/projected/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-kube-api-access-pvznz\") on node \"crc\" DevicePath \"\"" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.657052 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 06:03:39 crc kubenswrapper[4894]: I0613 06:03:39.657160 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ab5faf-81cf-474f-9ec9-f6111029b2e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.021957 4894 generic.go:334] "Generic (PLEG): container finished" podID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerID="95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95" exitCode=0 Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.022035 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28mv4" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.022060 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerDied","Data":"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95"} Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.022151 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28mv4" event={"ID":"d7ab5faf-81cf-474f-9ec9-f6111029b2e9","Type":"ContainerDied","Data":"709dfa8ecefd3ffdb3b44c6105e0e21e80e478ce56ab4f0ef7192a76298af0e4"} Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.022224 4894 scope.go:117] "RemoveContainer" containerID="95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.069667 4894 scope.go:117] "RemoveContainer" containerID="9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.095055 4894 scope.go:117] "RemoveContainer" containerID="83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.095483 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.123334 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28mv4"] Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.151401 4894 scope.go:117] "RemoveContainer" containerID="95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95" Jun 13 06:03:40 crc kubenswrapper[4894]: E0613 06:03:40.151921 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95\": container with ID starting with 95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95 not found: ID does not exist" containerID="95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.151947 4894 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95"} err="failed to get container status \"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95\": rpc error: code = NotFound desc = could not find container \"95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95\": container with ID starting with 95b638fa649ae6fc29238974e0fbae55d37f37ca326e9baca2b9fe040a766c95 not found: ID does not exist" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.151968 4894 scope.go:117] "RemoveContainer" containerID="9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9" Jun 13 06:03:40 crc kubenswrapper[4894]: E0613 06:03:40.152301 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9\": container with ID starting with 9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9 not found: ID does not exist" containerID="9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.152321 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9"} err="failed to get container status \"9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9\": rpc error: code = NotFound desc = could not find container \"9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9\": container with ID starting with 9229d3c810120180654349737107dde739314c352faf47844fc6e1ca123429c9 not found: ID does not exist" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.152332 4894 scope.go:117] "RemoveContainer" containerID="83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4" Jun 13 06:03:40 crc kubenswrapper[4894]: E0613 06:03:40.152723 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4\": container with ID starting with 83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4 not found: ID does not exist" containerID="83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.152743 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4"} err="failed to get container status \"83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4\": rpc error: code = NotFound desc = could not find container \"83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4\": container with ID starting with 83dce26bfd7c8f1642a04321116cb7268171e6b07a2eccf8387ba7d96781f2f4 not found: ID does not exist" Jun 13 06:03:40 crc kubenswrapper[4894]: I0613 06:03:40.286010 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" path="/var/lib/kubelet/pods/d7ab5faf-81cf-474f-9ec9-f6111029b2e9/volumes" Jun 13 06:03:53 crc kubenswrapper[4894]: I0613 06:03:53.277163 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:03:53 crc kubenswrapper[4894]: E0613 06:03:53.277883 4894 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.863092 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-9ks9k"] Jun 13 06:04:01 crc kubenswrapper[4894]: E0613 06:04:01.864457 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="extract-content" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.864481 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="extract-content" Jun 13 06:04:01 crc kubenswrapper[4894]: E0613 06:04:01.864536 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="registry-server" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.864550 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="registry-server" Jun 13 06:04:01 crc kubenswrapper[4894]: E0613 06:04:01.864577 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="extract-utilities" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.864593 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="extract-utilities" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.864951 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ab5faf-81cf-474f-9ec9-f6111029b2e9" containerName="registry-server" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.865970 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-9ks9k" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.919642 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djpxs\" (UniqueName: \"kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:01 crc kubenswrapper[4894]: I0613 06:04:01.920092 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: I0613 06:04:02.022497 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djpxs\" (UniqueName: \"kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: I0613 06:04:02.022837 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: I0613 06:04:02.023040 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: I0613 06:04:02.409940 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djpxs\" (UniqueName: \"kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs\") pod \"crc-debug-9ks9k\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: I0613 06:04:02.504003 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-9ks9k" Jun 13 06:04:02 crc kubenswrapper[4894]: W0613 06:04:02.587170 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc425c3e2_b16f_4e0d_ab59_8e15d23c4ad3.slice/crio-afcb2cee2c2fc00e24ece09182f53b520cfc2fffb694c69bb54817d1d3e74c82 WatchSource:0}: Error finding container afcb2cee2c2fc00e24ece09182f53b520cfc2fffb694c69bb54817d1d3e74c82: Status 404 returned error can't find the container with id afcb2cee2c2fc00e24ece09182f53b520cfc2fffb694c69bb54817d1d3e74c82 Jun 13 06:04:03 crc kubenswrapper[4894]: I0613 06:04:03.266331 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-9ks9k" event={"ID":"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3","Type":"ContainerStarted","Data":"35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e"} Jun 13 06:04:03 crc kubenswrapper[4894]: I0613 06:04:03.266675 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-9ks9k" event={"ID":"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3","Type":"ContainerStarted","Data":"afcb2cee2c2fc00e24ece09182f53b520cfc2fffb694c69bb54817d1d3e74c82"} Jun 13 06:04:03 crc kubenswrapper[4894]: I0613 06:04:03.283387 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-9ks9k" podStartSLOduration=2.2833744510000002 podStartE2EDuration="2.283374451s" podCreationTimestamp="2025-06-13 06:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:04:03.28050416 +0000 UTC m=+4401.726751623" watchObservedRunningTime="2025-06-13 06:04:03.283374451 +0000 UTC m=+4401.729621914" Jun 13 06:04:06 crc kubenswrapper[4894]: I0613 06:04:06.276997 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:04:06 crc kubenswrapper[4894]: E0613 06:04:06.278635 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:09 crc kubenswrapper[4894]: I0613 06:04:09.610252 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5bb5c5d86b-bl6pt_9f363e21-6e30-43d0-a699-9f671b627544/barbican-api-log/0.log" Jun 13 06:04:09 crc kubenswrapper[4894]: I0613 06:04:09.627496 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5bb5c5d86b-bl6pt_9f363e21-6e30-43d0-a699-9f671b627544/barbican-api/0.log" Jun 13 06:04:09 crc kubenswrapper[4894]: I0613 06:04:09.898419 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-688dbd77d4-tjxfc_26738de9-74f2-430a-a82d-ff0d7ec8e28f/barbican-keystone-listener-log/0.log" Jun 13 06:04:09 crc kubenswrapper[4894]: I0613 06:04:09.901016 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-688dbd77d4-tjxfc_26738de9-74f2-430a-a82d-ff0d7ec8e28f/barbican-keystone-listener/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.063830 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-6688589669-t4pqd_baefe546-6c9e-41e3-a02c-2a5123bea0aa/barbican-worker-log/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.095895 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6688589669-t4pqd_baefe546-6c9e-41e3-a02c-2a5123bea0aa/barbican-worker/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.310988 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ssqvq_c1350090-3cce-492c-a445-b80a7ac7afbc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.492486 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ee84851-1e35-483d-9c39-a8f8a0de6f30/ceilometer-central-agent/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.549116 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ee84851-1e35-483d-9c39-a8f8a0de6f30/ceilometer-notification-agent/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.600233 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ee84851-1e35-483d-9c39-a8f8a0de6f30/proxy-httpd/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.752246 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ee84851-1e35-483d-9c39-a8f8a0de6f30/sg-core/0.log" Jun 13 06:04:10 crc kubenswrapper[4894]: I0613 06:04:10.788690 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-rkcnd_8a854efd-8626-4a9f-beec-678b2916fb09/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.033668 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-djz58_1732efba-04bc-4c2f-9966-7e1ff39add5c/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.244402 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2bea4009-3767-4a17-8687-18008e27effd/cinder-api-log/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.317247 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2bea4009-3767-4a17-8687-18008e27effd/cinder-api/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.467895 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_768fd773-29d0-4a76-9b25-aa40764378a0/cinder-backup/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.505451 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_768fd773-29d0-4a76-9b25-aa40764378a0/probe/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.710210 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4f9c3009-19d2-4508-bf6d-11881d3f028a/probe/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.728403 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4f9c3009-19d2-4508-bf6d-11881d3f028a/cinder-scheduler/0.log" Jun 13 06:04:11 crc kubenswrapper[4894]: I0613 06:04:11.848188 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_285093b2-d93f-4e96-86e2-66bfe23a93e2/cinder-volume/0.log" Jun 13 06:04:12 crc 
kubenswrapper[4894]: I0613 06:04:12.561287 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_285093b2-d93f-4e96-86e2-66bfe23a93e2/probe/0.log" Jun 13 06:04:12 crc kubenswrapper[4894]: I0613 06:04:12.706897 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4jlrc_d54bac12-88af-439e-8aab-def55120ac8f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:12 crc kubenswrapper[4894]: I0613 06:04:12.797426 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n26xt_5936607e-9f35-4b2d-95c2-abfba163575e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:12 crc kubenswrapper[4894]: I0613 06:04:12.980216 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_crc-debug-9ks9k_c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3/container-00/0.log" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.038245 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-548948d657-2xbqw_612832fe-9d71-437a-af43-c8c06931a237/init/0.log" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.180230 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-9ks9k"] Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.180432 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-9ks9k" podUID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" containerName="container-00" containerID="cri-o://35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e" gracePeriod=2 Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.198666 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-9ks9k"] Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.228154 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-548948d657-2xbqw_612832fe-9d71-437a-af43-c8c06931a237/init/0.log" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.254046 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-9ks9k" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.339503 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djpxs\" (UniqueName: \"kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs\") pod \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.339686 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host\") pod \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\" (UID: \"c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3\") " Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.340108 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host" (OuterVolumeSpecName: "host") pod "c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" (UID: "c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.347422 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs" (OuterVolumeSpecName: "kube-api-access-djpxs") pod "c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" (UID: "c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3"). InnerVolumeSpecName "kube-api-access-djpxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.387037 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fbf8a2c2-e1d7-4341-b167-9162312c2b97/glance-httpd/0.log" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.388395 4894 generic.go:334] "Generic (PLEG): container finished" podID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" containerID="35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e" exitCode=0 Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.388440 4894 scope.go:117] "RemoveContainer" containerID="35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.388584 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-9ks9k" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.409916 4894 scope.go:117] "RemoveContainer" containerID="35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.411387 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-548948d657-2xbqw_612832fe-9d71-437a-af43-c8c06931a237/dnsmasq-dns/0.log" Jun 13 06:04:13 crc kubenswrapper[4894]: E0613 06:04:13.413782 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e\": container with ID starting with 35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e not found: ID does not exist" containerID="35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.413841 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e"} err="failed to get container status \"35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e\": rpc error: code = NotFound desc = could not find container \"35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e\": container with ID starting with 35760a021b3df9afe127d029339682571cd6f1c0fa5905c53ae2a2af337f839e not found: ID does not exist" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.442174 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.442223 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djpxs\" (UniqueName: \"kubernetes.io/projected/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3-kube-api-access-djpxs\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:13 crc kubenswrapper[4894]: I0613 06:04:13.495905 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_fbf8a2c2-e1d7-4341-b167-9162312c2b97/glance-log/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.187326 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52594d15-d5e4-432c-8125-d9e5ed137ad3/glance-httpd/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.260754 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52594d15-d5e4-432c-8125-d9e5ed137ad3/glance-log/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.285575 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" path="/var/lib/kubelet/pods/c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3/volumes" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.330738 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c647g_130edae6-6f39-446a-8add-df5ce866f925/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.396882 4894 generic.go:334] "Generic (PLEG): container finished" podID="f3a3bd19-df18-4ddd-b547-eb5c5b552d33" containerID="b52d70a9957a518618e5da9f40f5f09559257cf24307cfe76632b31edd4bc4bd" exitCode=0 Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.396946 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" event={"ID":"f3a3bd19-df18-4ddd-b547-eb5c5b552d33","Type":"ContainerDied","Data":"b52d70a9957a518618e5da9f40f5f09559257cf24307cfe76632b31edd4bc4bd"} Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.482687 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-znc5q_0ac5d831-55f4-495d-95ca-af1497a809e6/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.700332 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29163241-m92x7_55ff65a8-628e-4770-b12b-f2d50722acd2/keystone-cron/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.722439 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6bfbcbb6c7-q4p46_0e483628-45ae-49b7-bb58-abfeda32d6c0/keystone-api/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.915858 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5c5acc39-066b-40b1-abcf-b5311aea15d9/kube-state-metrics/0.log" Jun 13 06:04:14 crc kubenswrapper[4894]: I0613 06:04:14.959233 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6q564_3602457e-fe9b-47ab-9497-b0777af3f090/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.123044 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_081e7394-2489-452b-a7b3-6c12b22200c8/manila-api-log/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.148046 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_081e7394-2489-452b-a7b3-6c12b22200c8/manila-api/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.272311 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1cfe9e73-4d63-499a-b177-6ab1cd56f443/manila-scheduler/0.log" Jun 13 06:04:15 crc 
kubenswrapper[4894]: I0613 06:04:15.402012 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_1cfe9e73-4d63-499a-b177-6ab1cd56f443/probe/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.485504 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0220044c-1440-4924-b660-c2babe3d6acc/manila-share/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.488555 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.525013 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-rr8p9"] Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.533564 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-rr8p9"] Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.563319 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0220044c-1440-4924-b660-c2babe3d6acc/probe/0.log" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.579492 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host\") pod \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.579646 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host" (OuterVolumeSpecName: "host") pod "f3a3bd19-df18-4ddd-b547-eb5c5b552d33" (UID: "f3a3bd19-df18-4ddd-b547-eb5c5b552d33"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.579717 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989t8\" (UniqueName: \"kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8\") pod \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\" (UID: \"f3a3bd19-df18-4ddd-b547-eb5c5b552d33\") " Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.580184 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.593156 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8" (OuterVolumeSpecName: "kube-api-access-989t8") pod "f3a3bd19-df18-4ddd-b547-eb5c5b552d33" (UID: "f3a3bd19-df18-4ddd-b547-eb5c5b552d33"). InnerVolumeSpecName "kube-api-access-989t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.681759 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989t8\" (UniqueName: \"kubernetes.io/projected/f3a3bd19-df18-4ddd-b547-eb5c5b552d33-kube-api-access-989t8\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:15 crc kubenswrapper[4894]: I0613 06:04:15.925777 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-554d559d55-rtnwg_9401dca8-385e-4849-abb9-38059dd2ae63/neutron-api/0.log" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.032587 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-554d559d55-rtnwg_9401dca8-385e-4849-abb9-38059dd2ae63/neutron-httpd/0.log" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.220881 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-27cdx_31bca8ac-baaf-4837-96fd-6f0e556e7c53/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.291317 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a3bd19-df18-4ddd-b547-eb5c5b552d33" path="/var/lib/kubelet/pods/f3a3bd19-df18-4ddd-b547-eb5c5b552d33/volumes" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.414462 4894 scope.go:117] "RemoveContainer" containerID="b52d70a9957a518618e5da9f40f5f09559257cf24307cfe76632b31edd4bc4bd" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.414595 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-rr8p9" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.776583 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-h25lx"] Jun 13 06:04:16 crc kubenswrapper[4894]: E0613 06:04:16.777484 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a3bd19-df18-4ddd-b547-eb5c5b552d33" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.777562 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a3bd19-df18-4ddd-b547-eb5c5b552d33" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: E0613 06:04:16.777633 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.777714 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.777963 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a3bd19-df18-4ddd-b547-eb5c5b552d33" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.778037 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="c425c3e2-b16f-4e0d-ab59-8e15d23c4ad3" containerName="container-00" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.778694 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.781990 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e7c817a0-08df-456b-a791-53dd7ad019b6/nova-api-log/0.log" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.899305 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnsk\" (UniqueName: \"kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.899618 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:16 crc kubenswrapper[4894]: I0613 06:04:16.973175 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_440e7806-d1b2-4fc0-9d66-11f4dfa5f4a5/nova-cell0-conductor-conductor/0.log" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.000951 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.001113 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnsk\" (UniqueName: \"kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.001408 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.015450 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e7c817a0-08df-456b-a791-53dd7ad019b6/nova-api-api/0.log" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.026067 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnsk\" (UniqueName: \"kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk\") pod \"crc-debug-h25lx\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.108231 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.423509 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" event={"ID":"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b","Type":"ContainerStarted","Data":"964a18d1a87d56bab6692dc6bb994d71a273398850a075da207b425bf8b35eb4"} Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.602338 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_088bef64-0dd7-48f3-9977-6fc21e24686a/nova-cell1-novncproxy-novncproxy/0.log" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.706200 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c7fe666a-475a-4f55-9edb-4f19b5b87f73/nova-cell1-conductor-conductor/0.log" Jun 13 06:04:17 crc kubenswrapper[4894]: I0613 06:04:17.998009 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7j9k8_17cfff45-e7b3-4297-9b08-33a8ea345bc2/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:18 crc kubenswrapper[4894]: I0613 06:04:18.264072 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8c7b253d-3144-4952-b97b-c65d19b4524a/nova-metadata-log/0.log" Jun 13 06:04:18 crc kubenswrapper[4894]: I0613 06:04:18.432446 4894 generic.go:334] "Generic (PLEG): container finished" podID="96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" containerID="509902f51ddd1928014d1b6ba445e840f458d367bdcce5565e2ac9e2aa86a3a4" exitCode=0 Jun 13 06:04:18 crc kubenswrapper[4894]: I0613 06:04:18.432486 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" event={"ID":"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b","Type":"ContainerDied","Data":"509902f51ddd1928014d1b6ba445e840f458d367bdcce5565e2ac9e2aa86a3a4"} Jun 13 06:04:18 crc kubenswrapper[4894]: I0613 06:04:18.704620 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ff418d80-000a-45be-932e-6b1705b9ab49/nova-scheduler-scheduler/0.log" Jun 13 06:04:18 crc kubenswrapper[4894]: I0613 06:04:18.715272 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5e584455-5537-425d-a454-063087cc3fea/mysql-bootstrap/0.log" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.054746 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5e584455-5537-425d-a454-063087cc3fea/galera/0.log" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.058969 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5e584455-5537-425d-a454-063087cc3fea/mysql-bootstrap/0.log" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.129502 4894 scope.go:117] "RemoveContainer" containerID="2cc0177cd087d1860336a079320e818fd16f6cc2526e66ea9a00e1394397240c" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.349874 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_138daa45-0563-4c44-8b99-9bfb66eea5c6/mysql-bootstrap/0.log" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.553557 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.596619 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnsk\" (UniqueName: \"kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk\") pod \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.597078 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host\") pod \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\" (UID: \"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b\") " Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.597201 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host" (OuterVolumeSpecName: "host") pod "96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" (UID: "96b18af7-e104-4c47-b5eb-9b0b56b9fe6b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.597850 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.615351 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk" (OuterVolumeSpecName: "kube-api-access-jjnsk") pod "96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" (UID: "96b18af7-e104-4c47-b5eb-9b0b56b9fe6b"). InnerVolumeSpecName "kube-api-access-jjnsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.638344 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_138daa45-0563-4c44-8b99-9bfb66eea5c6/mysql-bootstrap/0.log" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.708390 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnsk\" (UniqueName: \"kubernetes.io/projected/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b-kube-api-access-jjnsk\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:19 crc kubenswrapper[4894]: I0613 06:04:19.862033 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_138daa45-0563-4c44-8b99-9bfb66eea5c6/galera/0.log" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.095636 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6fee8542-bed9-434a-86c2-709235db9cf0/openstackclient/0.log" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.277906 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:04:20 crc kubenswrapper[4894]: E0613 06:04:20.278210 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.335170 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8c7b253d-3144-4952-b97b-c65d19b4524a/nova-metadata-metadata/0.log" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.449794 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" event={"ID":"96b18af7-e104-4c47-b5eb-9b0b56b9fe6b","Type":"ContainerDied","Data":"964a18d1a87d56bab6692dc6bb994d71a273398850a075da207b425bf8b35eb4"} Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.449871 4894 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="964a18d1a87d56bab6692dc6bb994d71a273398850a075da207b425bf8b35eb4" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.449952 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-h25lx" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.493697 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5zm5p_e715a67e-623b-4d05-8bc9-676747d445fb/ovn-controller/0.log" Jun 13 06:04:20 crc kubenswrapper[4894]: I0613 06:04:20.592490 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vb6p4_138962ec-89d1-4771-adad-e9a0d910e80b/ovsdb-server-init/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.226081 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vb6p4_138962ec-89d1-4771-adad-e9a0d910e80b/ovsdb-server/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.239774 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vb6p4_138962ec-89d1-4771-adad-e9a0d910e80b/ovsdb-server-init/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.298861 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vb6p4_138962ec-89d1-4771-adad-e9a0d910e80b/ovs-vswitchd/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.698779 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bzzpl_32106b20-1e0f-4cd7-bbd8-b092163a9035/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.914492 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-774dc9f9f-dcqbb_71f90c09-b2cc-4e1c-a18f-595a5efeb141/ovn-northd/0.log" Jun 13 06:04:21 crc kubenswrapper[4894]: I0613 06:04:21.963558 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e3d6bb95-a363-4ac9-8034-dff9e9642464/ovsdbserver-nb/0.log" Jun 13 06:04:22 crc kubenswrapper[4894]: I0613 06:04:22.300099 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6d3ebc6-d7cd-4b8d-bc3f-70f6e8819a32/ovsdbserver-sb/0.log" Jun 13 06:04:22 crc kubenswrapper[4894]: I0613 06:04:22.711434 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56495d6d8b-4hpz7_8e69886e-0c5c-4f2b-b479-fd600873129b/placement-api/0.log" Jun 13 06:04:22 crc kubenswrapper[4894]: I0613 06:04:22.920774 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56495d6d8b-4hpz7_8e69886e-0c5c-4f2b-b479-fd600873129b/placement-log/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.008054 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c2d6cfd6-2bbf-4bcc-a837-28ab5958af73/setup-container/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.314271 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c2d6cfd6-2bbf-4bcc-a837-28ab5958af73/setup-container/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.456734 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c2d6cfd6-2bbf-4bcc-a837-28ab5958af73/rabbitmq/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.727569 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c5edbb43-096d-46e3-9e13-827e7eb51868/setup-container/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.937638 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_c5edbb43-096d-46e3-9e13-827e7eb51868/setup-container/0.log" Jun 13 06:04:23 crc kubenswrapper[4894]: I0613 06:04:23.941382 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c5edbb43-096d-46e3-9e13-827e7eb51868/rabbitmq/0.log" Jun 13 06:04:24 crc kubenswrapper[4894]: I0613 06:04:24.195035 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-w8k8g_40081dfe-b604-4593-a435-3310168c3c31/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:24 crc kubenswrapper[4894]: I0613 06:04:24.363062 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j8hsc_c30d60b6-5327-4d21-b668-a5aa64265c8c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:24 crc kubenswrapper[4894]: I0613 06:04:24.798298 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-v454f_8f396ee0-4caf-4c2c-a060-e46767e338c9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:24 crc kubenswrapper[4894]: I0613 06:04:24.903953 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-nbbfz_f58c6e89-c103-43f0-b2a7-fcb21a7f677c/ssh-known-hosts-edpm-deployment/0.log" Jun 13 06:04:25 crc kubenswrapper[4894]: I0613 06:04:25.259552 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_357d4a2c-de1e-47c2-8602-9b717b898330/tempest-tests-tempest-tests-runner/0.log" Jun 13 06:04:25 crc kubenswrapper[4894]: I0613 06:04:25.367651 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9a0f3544-2f43-428d-ab9a-04afce67765a/test-operator-logs-container/0.log" Jun 13 06:04:25 crc kubenswrapper[4894]: I0613 06:04:25.517597 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lr9vm_f45b6e84-264a-43a0-8103-86b94fbbc5a5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jun 13 06:04:27 crc kubenswrapper[4894]: I0613 06:04:27.469596 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-h25lx"] Jun 13 06:04:27 crc kubenswrapper[4894]: I0613 06:04:27.478778 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-h25lx"] Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.287324 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" path="/var/lib/kubelet/pods/96b18af7-e104-4c47-b5eb-9b0b56b9fe6b/volumes" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.708501 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-sgvt7"] Jun 13 06:04:28 crc kubenswrapper[4894]: E0613 06:04:28.708879 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" containerName="container-00" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.708891 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" containerName="container-00" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.709085 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b18af7-e104-4c47-b5eb-9b0b56b9fe6b" containerName="container-00" Jun 13 06:04:28 
crc kubenswrapper[4894]: I0613 06:04:28.709655 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.803007 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.803166 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.904304 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.904612 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.904765 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:28 crc kubenswrapper[4894]: I0613 06:04:28.933179 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz\") pod \"crc-debug-sgvt7\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.027939 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.508792 4894 generic.go:334] "Generic (PLEG): container finished" podID="ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" containerID="22c54722a813c813f2533ef31a8dce78be140b3308f6e074d6126082e5094c54" exitCode=0 Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.508872 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" event={"ID":"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13","Type":"ContainerDied","Data":"22c54722a813c813f2533ef31a8dce78be140b3308f6e074d6126082e5094c54"} Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.509055 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" event={"ID":"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13","Type":"ContainerStarted","Data":"c2e20407d203a25aac17545863015c20a26c513e93ac681e6af109131ecf3f8f"} Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.554605 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-sgvt7"] Jun 13 06:04:29 crc kubenswrapper[4894]: I0613 06:04:29.565228 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vp6sz/crc-debug-sgvt7"] Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.632442 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.833093 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz\") pod \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.833337 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host\") pod \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\" (UID: \"ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13\") " Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.833384 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host" (OuterVolumeSpecName: "host") pod "ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" (UID: "ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.833731 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.839516 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz" (OuterVolumeSpecName: "kube-api-access-2xbcz") pod "ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" (UID: "ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13"). InnerVolumeSpecName "kube-api-access-2xbcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:04:30 crc kubenswrapper[4894]: I0613 06:04:30.936131 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13-kube-api-access-2xbcz\") on node \"crc\" DevicePath \"\"" Jun 13 06:04:31 crc kubenswrapper[4894]: I0613 06:04:31.523236 4894 scope.go:117] "RemoveContainer" containerID="22c54722a813c813f2533ef31a8dce78be140b3308f6e074d6126082e5094c54" Jun 13 06:04:31 crc kubenswrapper[4894]: I0613 06:04:31.523606 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6sz/crc-debug-sgvt7" Jun 13 06:04:32 crc kubenswrapper[4894]: I0613 06:04:32.296256 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" path="/var/lib/kubelet/pods/ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13/volumes" Jun 13 06:04:35 crc kubenswrapper[4894]: I0613 06:04:35.276227 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:04:35 crc kubenswrapper[4894]: E0613 06:04:35.277682 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:37 crc kubenswrapper[4894]: I0613 06:04:37.842289 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1bb10c27-9b94-43cd-82df-407e68605449/memcached/0.log" Jun 13 06:04:47 crc kubenswrapper[4894]: I0613 06:04:47.278018 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:04:47 crc kubenswrapper[4894]: E0613 06:04:47.278676 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.240588 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/util/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.496353 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/util/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.506271 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/pull/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.529147 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/pull/0.log" Jun 
13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.658340 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/util/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.686819 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/pull/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.689291 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3652a20a1d637055331de6336b5bcb34cd5bf92ee0051abd135af36dcflgfw4_318fa333-aef2-42ac-bd8c-6232198b5093/extract/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.845585 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-9889b4756-lsslv_784a682d-1749-4399-a1f4-1e8bee7968ce/kube-rbac-proxy/0.log" Jun 13 06:04:55 crc kubenswrapper[4894]: I0613 06:04:55.957429 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-9889b4756-lsslv_784a682d-1749-4399-a1f4-1e8bee7968ce/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.030007 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57f4dc9749-rf6b7_250d2934-5f6e-4d4f-96d9-ec258c71909e/kube-rbac-proxy/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.095584 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57f4dc9749-rf6b7_250d2934-5f6e-4d4f-96d9-ec258c71909e/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.165915 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b554678df-6trss_6e780a91-140a-4b7b-9748-c3a6c3b954e1/kube-rbac-proxy/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.242452 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b554678df-6trss_6e780a91-140a-4b7b-9748-c3a6c3b954e1/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.395588 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-97b97479c-jw8m6_7d3873c8-7bab-42f8-918a-344d87eacce9/kube-rbac-proxy/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.490744 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-97b97479c-jw8m6_7d3873c8-7bab-42f8-918a-344d87eacce9/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.547543 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5486f4b54f-xdn4k_ea63dc95-4a48-4ed4-b990-c6990bbe3d33/kube-rbac-proxy/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.649940 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5486f4b54f-xdn4k_ea63dc95-4a48-4ed4-b990-c6990bbe3d33/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.806849 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7777cf768b-bm84t_297946dc-5d6d-4389-bff3-3044865254ef/kube-rbac-proxy/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.813907 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7777cf768b-bm84t_297946dc-5d6d-4389-bff3-3044865254ef/manager/0.log" Jun 13 06:04:56 crc kubenswrapper[4894]: I0613 06:04:56.912385 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5b4ccb8c4-2mcf5_5c09333e-da20-4f48-96b9-29021e93149b/kube-rbac-proxy/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.123711 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5b4ccb8c4-2mcf5_5c09333e-da20-4f48-96b9-29021e93149b/manager/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.533344 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-68f4bbb747-nfmz2_4c379ff5-1113-4698-a898-9c1cb29000cf/kube-rbac-proxy/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.606031 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-68f4bbb747-nfmz2_4c379ff5-1113-4698-a898-9c1cb29000cf/manager/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.621493 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5ccbd96f89-hrh2h_43453734-49dd-48b0-86b4-46b20966f2f5/kube-rbac-proxy/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.822501 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5ccbd96f89-hrh2h_43453734-49dd-48b0-86b4-46b20966f2f5/manager/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.832527 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-75b8755b74-q5plz_1295c691-04d0-4e6e-a4e6-4f85c6715964/kube-rbac-proxy/0.log" Jun 13 06:04:57 crc kubenswrapper[4894]: I0613 06:04:57.901463 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-75b8755b74-q5plz_1295c691-04d0-4e6e-a4e6-4f85c6715964/manager/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.010777 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7d4bbc7f54-r57lj_6f399f41-0f28-471b-be85-3468ff990e9d/kube-rbac-proxy/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.038955 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7d4bbc7f54-r57lj_6f399f41-0f28-471b-be85-3468ff990e9d/manager/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.114541 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5df6744645-ll2wl_c185ce61-38da-4eec-ab4d-4e73fbd9a957/kube-rbac-proxy/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.234737 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5df6744645-ll2wl_c185ce61-38da-4eec-ab4d-4e73fbd9a957/manager/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.308392 4894 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-664db87fd8-m64zp_99063b46-9295-41e6-8ad6-5e6cefce2931/kube-rbac-proxy/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.364415 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-664db87fd8-m64zp_99063b46-9295-41e6-8ad6-5e6cefce2931/manager/0.log" Jun 13 06:04:58 crc kubenswrapper[4894]: I0613 06:04:58.446423 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-857f9d6b88-pt7m6_4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0/kube-rbac-proxy/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.145884 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt_bad848ff-73e6-4dad-a141-feac145e5c38/kube-rbac-proxy/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.149451 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-857f9d6b88-pt7m6_4cbe8d8f-e512-4ff2-8128-f9cbe4e070a0/manager/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.225531 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7dfb6cb54-6j8qt_bad848ff-73e6-4dad-a141-feac145e5c38/manager/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.277525 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:04:59 crc kubenswrapper[4894]: E0613 06:04:59.278076 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.366012 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74d9b8b9f5-cj7hp_2b3aee0a-6aa2-494d-8eae-5f97d5954868/kube-rbac-proxy/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.423215 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d66c4c8c7-6cm6g_712d94c9-1d98-4ff5-8af0-d15cab94e874/kube-rbac-proxy/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.632380 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c727t_7f203b92-325f-4c66-9c50-6269d6f628cf/registry-server/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.683878 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d66c4c8c7-6cm6g_712d94c9-1d98-4ff5-8af0-d15cab94e874/operator/0.log" Jun 13 06:04:59 crc kubenswrapper[4894]: I0613 06:04:59.947218 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9f78645d5-s9r55_6d32b2ff-59b7-4326-94e9-69e0fbd6ce34/kube-rbac-proxy/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.024336 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9f78645d5-s9r55_6d32b2ff-59b7-4326-94e9-69e0fbd6ce34/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.067014 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-58f798889d-2n26t_292d16cc-5623-4aa8-a644-2e69a901ca6f/kube-rbac-proxy/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.343049 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-58f798889d-2n26t_292d16cc-5623-4aa8-a644-2e69a901ca6f/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.378902 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74d9b8b9f5-cj7hp_2b3aee0a-6aa2-494d-8eae-5f97d5954868/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.436467 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-67ff8584d-fzgb7_4f86499e-2447-4489-89d5-1777e4d445c6/operator/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.578359 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7779c57cf7-7zldr_898d7bc9-6d9c-4e81-b72e-fdb6f7440b43/kube-rbac-proxy/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.662482 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7779c57cf7-7zldr_898d7bc9-6d9c-4e81-b72e-fdb6f7440b43/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.709148 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-884d667-sk2l9_de71738a-f07f-49c4-9820-1480db37be05/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.714841 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-884d667-sk2l9_de71738a-f07f-49c4-9820-1480db37be05/kube-rbac-proxy/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.910040 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6db7bffb67-rnhvc_29f79bd1-5c08-4435-8024-0a136c6b9337/manager/0.log" Jun 13 06:05:00 crc kubenswrapper[4894]: I0613 06:05:00.951713 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6db7bffb67-rnhvc_29f79bd1-5c08-4435-8024-0a136c6b9337/kube-rbac-proxy/0.log" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.634840 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-mnfnr"] Jun 13 06:05:01 crc kubenswrapper[4894]: E0613 06:05:01.635442 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" containerName="container-00" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.635453 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" containerName="container-00" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.635677 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae48c3b9-a6a9-4b3c-b661-95eba4f5fd13" containerName="container-00" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.636258 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.738306 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.738438 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bdx6\" (UniqueName: \"kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.839709 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdx6\" (UniqueName: \"kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.839877 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.839825 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.856828 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdx6\" (UniqueName: \"kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6\") pod \"crc-debug-mnfnr\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " pod="openstack/crc-debug-mnfnr" Jun 13 06:05:01 crc kubenswrapper[4894]: I0613 06:05:01.950863 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-mnfnr" Jun 13 06:05:02 crc kubenswrapper[4894]: I0613 06:05:02.794527 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mnfnr" event={"ID":"ac037f68-fb31-428d-aca2-b4f9d326a5bb","Type":"ContainerStarted","Data":"78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675"} Jun 13 06:05:02 crc kubenswrapper[4894]: I0613 06:05:02.794918 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-mnfnr" event={"ID":"ac037f68-fb31-428d-aca2-b4f9d326a5bb","Type":"ContainerStarted","Data":"8f8b6071e9840b2c4b3f05200b40cada1de94010bd9162b324175e8729e34e1a"} Jun 13 06:05:02 crc kubenswrapper[4894]: I0613 06:05:02.814108 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-mnfnr" podStartSLOduration=1.814093103 podStartE2EDuration="1.814093103s" podCreationTimestamp="2025-06-13 06:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:05:02.808752622 +0000 UTC m=+4461.255000095" watchObservedRunningTime="2025-06-13 06:05:02.814093103 +0000 UTC m=+4461.260340566" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.569575 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-mnfnr"] Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.570324 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-mnfnr" podUID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" containerName="container-00" containerID="cri-o://78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675" gracePeriod=2 Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.585787 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-mnfnr"] Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.677176 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mnfnr" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.750060 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bdx6\" (UniqueName: \"kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6\") pod \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.750380 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host\") pod \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\" (UID: \"ac037f68-fb31-428d-aca2-b4f9d326a5bb\") " Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.750916 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host" (OuterVolumeSpecName: "host") pod "ac037f68-fb31-428d-aca2-b4f9d326a5bb" (UID: "ac037f68-fb31-428d-aca2-b4f9d326a5bb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.758307 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6" (OuterVolumeSpecName: "kube-api-access-7bdx6") pod "ac037f68-fb31-428d-aca2-b4f9d326a5bb" (UID: "ac037f68-fb31-428d-aca2-b4f9d326a5bb"). 
InnerVolumeSpecName "kube-api-access-7bdx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.853096 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bdx6\" (UniqueName: \"kubernetes.io/projected/ac037f68-fb31-428d-aca2-b4f9d326a5bb-kube-api-access-7bdx6\") on node \"crc\" DevicePath \"\"" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.853154 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac037f68-fb31-428d-aca2-b4f9d326a5bb-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.876746 4894 generic.go:334] "Generic (PLEG): container finished" podID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" containerID="78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675" exitCode=0 Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.876826 4894 scope.go:117] "RemoveContainer" containerID="78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.876884 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-mnfnr" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.913834 4894 scope.go:117] "RemoveContainer" containerID="78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675" Jun 13 06:05:12 crc kubenswrapper[4894]: E0613 06:05:12.914315 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675\": container with ID starting with 78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675 not found: ID does not exist" containerID="78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675" Jun 13 06:05:12 crc kubenswrapper[4894]: I0613 06:05:12.914357 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675"} err="failed to get container status \"78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675\": rpc error: code = NotFound desc = could not find container \"78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675\": container with ID starting with 78ec60607ab911b852c482deacb01bb03bd559b3252fe416dc9b67cd49924675 not found: ID does not exist" Jun 13 06:05:13 crc kubenswrapper[4894]: I0613 06:05:13.277467 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:05:13 crc kubenswrapper[4894]: E0613 06:05:13.278127 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:05:14 crc kubenswrapper[4894]: I0613 06:05:14.294527 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" path="/var/lib/kubelet/pods/ac037f68-fb31-428d-aca2-b4f9d326a5bb/volumes" Jun 13 06:05:20 crc kubenswrapper[4894]: I0613 06:05:20.001313 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5v28j_63289114-7b7f-45b9-85ad-2b265f69bdee/control-plane-machine-set-operator/0.log" Jun 13 06:05:20 crc kubenswrapper[4894]: I0613 06:05:20.161743 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-chjqk_b6c8af75-ffb5-4e91-9d8c-751ba03f67ba/machine-api-operator/0.log" Jun 13 06:05:20 crc kubenswrapper[4894]: I0613 06:05:20.175384 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-chjqk_b6c8af75-ffb5-4e91-9d8c-751ba03f67ba/kube-rbac-proxy/0.log" Jun 13 06:05:28 crc kubenswrapper[4894]: I0613 06:05:28.277542 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:05:28 crc kubenswrapper[4894]: E0613 06:05:28.278624 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:05:35 crc kubenswrapper[4894]: I0613 06:05:35.320832 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vzsr6_e2bb6864-165e-42ff-b86e-19129ead9f47/cert-manager-controller/0.log" Jun 13 06:05:35 crc kubenswrapper[4894]: I0613 06:05:35.411813 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wj9fc_d817e341-4899-4993-b77d-827f73433f02/cert-manager-cainjector/0.log" Jun 13 06:05:35 crc kubenswrapper[4894]: I0613 06:05:35.500779 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-qh9kw_8722649b-9056-433e-a648-fb9e76f9e2e1/cert-manager-webhook/0.log" Jun 13 06:05:39 crc kubenswrapper[4894]: I0613 06:05:39.277004 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:05:39 crc kubenswrapper[4894]: E0613 06:05:39.277635 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:05:50 crc kubenswrapper[4894]: I0613 06:05:50.276272 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:05:50 crc kubenswrapper[4894]: E0613 06:05:50.276857 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:05:51 crc kubenswrapper[4894]: I0613 06:05:51.510075 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-67b45cfc7d-k6jsz_52d99494-f908-4ff0-95b0-48261d144df9/nmstate-console-plugin/0.log" Jun 13 06:05:51 crc kubenswrapper[4894]: I0613 06:05:51.724084 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pmtdt_fe26ab59-9bfe-4a09-af5f-ce93f75ae3e8/nmstate-handler/0.log" Jun 13 06:05:51 crc kubenswrapper[4894]: I0613 06:05:51.828260 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-748555f888-pnvfx_b08ebefd-4622-4b62-8b92-63c658947cb1/nmstate-metrics/0.log" Jun 13 06:05:51 crc kubenswrapper[4894]: I0613 06:05:51.874874 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-748555f888-pnvfx_b08ebefd-4622-4b62-8b92-63c658947cb1/kube-rbac-proxy/0.log" Jun 13 06:05:51 crc kubenswrapper[4894]: I0613 06:05:51.996459 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d8f945fdc-jnpn4_e4ea5c7c-a747-4a1e-82ba-8b81e3c0a4c0/nmstate-operator/0.log" Jun 13 06:05:52 crc kubenswrapper[4894]: I0613 06:05:52.103851 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-79c49d6bf4-jztch_d541bc7b-8bed-4996-b3e3-e851b13f6fc4/nmstate-webhook/0.log" Jun 13 06:06:01 crc kubenswrapper[4894]: I0613 06:06:01.311602 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:06:01 crc kubenswrapper[4894]: E0613 06:06:01.312349 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.015843 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-75zf5"] Jun 13 06:06:02 crc kubenswrapper[4894]: E0613 06:06:02.016713 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" containerName="container-00" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.016745 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" containerName="container-00" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.017087 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac037f68-fb31-428d-aca2-b4f9d326a5bb" containerName="container-00" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.017822 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.150952 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzxv\" (UniqueName: \"kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.151164 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.253059 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzxv\" (UniqueName: \"kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.253228 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.253321 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.273990 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzxv\" (UniqueName: \"kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv\") pod \"crc-debug-75zf5\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " pod="openstack/crc-debug-75zf5" Jun 13 06:06:02 crc kubenswrapper[4894]: I0613 06:06:02.353487 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-75zf5" Jun 13 06:06:03 crc kubenswrapper[4894]: I0613 06:06:03.342263 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-75zf5" event={"ID":"cfe13a8b-b79e-4787-9bd0-a887f1898204","Type":"ContainerStarted","Data":"7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37"} Jun 13 06:06:03 crc kubenswrapper[4894]: I0613 06:06:03.342317 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-75zf5" event={"ID":"cfe13a8b-b79e-4787-9bd0-a887f1898204","Type":"ContainerStarted","Data":"7322e35bece9251f129b2445b5804a1446a59d29c27815d413c05349df908aff"} Jun 13 06:06:03 crc kubenswrapper[4894]: I0613 06:06:03.363246 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-75zf5" podStartSLOduration=2.363229215 podStartE2EDuration="2.363229215s" podCreationTimestamp="2025-06-13 06:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:06:03.353826079 +0000 UTC m=+4521.800073572" watchObservedRunningTime="2025-06-13 06:06:03.363229215 +0000 UTC m=+4521.809476678" Jun 13 06:06:09 crc kubenswrapper[4894]: I0613 06:06:09.303741 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5f968f88cc-wrmdl_7138d1ac-8b33-4219-b8fd-303eae7e0334/kube-rbac-proxy/0.log" Jun 13 06:06:09 crc kubenswrapper[4894]: I0613 06:06:09.381157 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5f968f88cc-wrmdl_7138d1ac-8b33-4219-b8fd-303eae7e0334/controller/0.log" Jun 13 06:06:09 crc kubenswrapper[4894]: I0613 06:06:09.928162 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-frr-files/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.142540 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-metrics/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.156421 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-frr-files/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.220030 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-reloader/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.248609 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-reloader/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.478632 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-frr-files/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.535589 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-reloader/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.542430 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-metrics/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.566437 4894 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-metrics/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.875051 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-frr-files/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.881172 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-metrics/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.912986 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/controller/0.log" Jun 13 06:06:10 crc kubenswrapper[4894]: I0613 06:06:10.921354 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/cp-reloader/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.107435 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/frr-metrics/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.160311 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/kube-rbac-proxy/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.235117 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/kube-rbac-proxy-frr/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.505891 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/reloader/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.543284 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-8457d999f9-dn8gv_3b112a53-8bb0-4587-9f69-debaf87494c9/frr-k8s-webhook-server/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.832144 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d4cd7966b-4t7qq_d81de5ae-61d6-4f4c-b9b2-03dd880f3465/webhook-server/0.log" Jun 13 06:06:11 crc kubenswrapper[4894]: I0613 06:06:11.854882 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-856f595c5f-qqwj8_6b796cfd-5cf3-47be-ae8e-f3d77fc7917d/manager/0.log" Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.080534 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5z464_8d6bd32d-109c-4f18-a8de-057098dae117/kube-rbac-proxy/0.log" Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.168590 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kwsg5_ce477a54-7315-4d18-9fc3-af3bd9216888/frr/0.log" Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.292017 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:06:12 crc kubenswrapper[4894]: E0613 06:06:12.292226 4894 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t6vz8_openshift-machine-config-operator(192fcf92-25d2-4664-bb9d-8857929dd084)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.498365 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5z464_8d6bd32d-109c-4f18-a8de-057098dae117/speaker/0.log" Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.881122 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-75zf5"] Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.881518 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-75zf5" podUID="cfe13a8b-b79e-4787-9bd0-a887f1898204" containerName="container-00" containerID="cri-o://7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37" gracePeriod=2 Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.903369 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-75zf5"] Jun 13 06:06:12 crc kubenswrapper[4894]: I0613 06:06:12.966862 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-75zf5" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.002510 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host\") pod \"cfe13a8b-b79e-4787-9bd0-a887f1898204\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.002764 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jzxv\" (UniqueName: \"kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv\") pod \"cfe13a8b-b79e-4787-9bd0-a887f1898204\" (UID: \"cfe13a8b-b79e-4787-9bd0-a887f1898204\") " Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.002883 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host" (OuterVolumeSpecName: "host") pod "cfe13a8b-b79e-4787-9bd0-a887f1898204" (UID: "cfe13a8b-b79e-4787-9bd0-a887f1898204"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.003155 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfe13a8b-b79e-4787-9bd0-a887f1898204-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.009815 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv" (OuterVolumeSpecName: "kube-api-access-7jzxv") pod "cfe13a8b-b79e-4787-9bd0-a887f1898204" (UID: "cfe13a8b-b79e-4787-9bd0-a887f1898204"). InnerVolumeSpecName "kube-api-access-7jzxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.104346 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jzxv\" (UniqueName: \"kubernetes.io/projected/cfe13a8b-b79e-4787-9bd0-a887f1898204-kube-api-access-7jzxv\") on node \"crc\" DevicePath \"\"" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.423815 4894 generic.go:334] "Generic (PLEG): container finished" podID="cfe13a8b-b79e-4787-9bd0-a887f1898204" containerID="7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37" exitCode=0 Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.423867 4894 scope.go:117] "RemoveContainer" containerID="7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.423879 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-75zf5" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.443003 4894 scope.go:117] "RemoveContainer" containerID="7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37" Jun 13 06:06:13 crc kubenswrapper[4894]: E0613 06:06:13.443300 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37\": container with ID starting with 7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37 not found: ID does not exist" containerID="7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37" Jun 13 06:06:13 crc kubenswrapper[4894]: I0613 06:06:13.443338 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37"} err="failed to get container status \"7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37\": rpc error: code = NotFound desc = could not find container \"7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37\": container with ID starting with 7ee1dc2cc83d45723cfdaba6748b42d6b52629399539b1cbc162787cfdff6e37 not found: ID does not exist" Jun 13 06:06:14 crc kubenswrapper[4894]: I0613 06:06:14.296711 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe13a8b-b79e-4787-9bd0-a887f1898204" path="/var/lib/kubelet/pods/cfe13a8b-b79e-4787-9bd0-a887f1898204/volumes" Jun 13 06:06:27 crc kubenswrapper[4894]: I0613 06:06:27.278824 4894 scope.go:117] "RemoveContainer" containerID="65abe87c7c08e926327be8ff7ed15827b44c26be61d773ecfcdd460a34e02bc0" Jun 13 06:06:27 crc kubenswrapper[4894]: I0613 06:06:27.578916 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" event={"ID":"192fcf92-25d2-4664-bb9d-8857929dd084","Type":"ContainerStarted","Data":"c1f73ab8deb2d17216069135d505f04980ad4ae2f9807ae2c3f0b7f47cd5d8ad"} Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.159867 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/util/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.377910 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/util/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.400362 4894 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/pull/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.455943 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/pull/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.694971 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/util/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.699574 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/extract/0.log" Jun 13 06:06:28 crc kubenswrapper[4894]: I0613 06:06:28.699972 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6589df99d30ac9cb6e2ff26885e3c29d10fbe97338967aa6e4a5a06c852t6kz_50205d6d-0191-47eb-bf4a-1e9157be8634/pull/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.002517 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/util/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.206205 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/pull/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.209267 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/util/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.241572 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/pull/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.379333 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/pull/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.404118 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/util/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.460884 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_cb0993eeb7e97796aae9794463bd8f8d6c157d6b22eddb36c16ab757d27dj7x_4b945725-0767-451f-9574-d95782ced9c9/extract/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.577887 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-utilities/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.782053 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-content/0.log" 
Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.790305 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-utilities/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.792301 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-content/0.log" Jun 13 06:06:29 crc kubenswrapper[4894]: I0613 06:06:29.994679 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-utilities/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.030127 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/extract-content/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.249546 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-utilities/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.451095 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kgrkm_aa646541-7ac7-4734-a72b-f1ee54746c8b/registry-server/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.548723 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-utilities/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.557453 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-content/0.log" Jun 13 06:06:30 crc kubenswrapper[4894]: I0613 06:06:30.569622 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-content/0.log" Jun 13 06:06:31 crc kubenswrapper[4894]: I0613 06:06:31.684572 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-utilities/0.log" Jun 13 06:06:31 crc kubenswrapper[4894]: I0613 06:06:31.740132 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/extract-content/0.log" Jun 13 06:06:31 crc kubenswrapper[4894]: I0613 06:06:31.771201 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7cgms_b31c24cf-e9ca-4ae7-aba9-d151e098ae5c/marketplace-operator/0.log" Jun 13 06:06:31 crc kubenswrapper[4894]: I0613 06:06:31.928391 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-458gw_ede0a450-b219-4eeb-a434-bf3ee2f699de/registry-server/0.log" Jun 13 06:06:31 crc kubenswrapper[4894]: I0613 06:06:31.964001 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-utilities/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.169795 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-utilities/0.log" 
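Each kubenswrapper message in these entries carries a klog-style header after the journald prefix: a severity letter (I/W/E), the date as mmdd, a microsecond timestamp, the emitting process ID, and the source file:line, followed by the structured message. A rough sketch (my own regex, assuming exactly the format shown above and operating on the part of the line after the "kubenswrapper[4894]:" prefix) that extracts those fields from one of the lines in this section:

```python
import re

# Illustrative only: the klog-style header embedded in the journal entries above,
#   <severity><mmdd> <hh:mm:ss.micros> <pid> <file:line>] <message>
KLOG_RE = re.compile(
    r'(?P<severity>[IWEF])(?P<date>\d{4})\s+'
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d{6})\s+'
    r'(?P<pid>\d+)\s+'
    r'(?P<source>[\w./-]+:\d+)\]\s+'
    r'(?P<message>.*)'
)

line = ('I0613 06:06:32.169795 4894 log.go:25] "Finished parsing log file" '
        'path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_'
        '45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-utilities/0.log"')

m = KLOG_RE.match(line)
if m:
    print(m.group("severity"), m.group("time"), m.group("source"))
```

The same pattern covers the warning (W0613 ...) and error (E0613 ...) records that appear later in this log, which differ only in the severity letter.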
Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.201080 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-content/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.246344 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-content/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.448392 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-content/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.492164 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/extract-utilities/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.557199 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b5kgb_45f0f708-2377-4d6f-915f-ae04b3ba4e4b/registry-server/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.597528 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-utilities/0.log" Jun 13 06:06:32 crc kubenswrapper[4894]: I0613 06:06:32.755378 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-utilities/0.log" Jun 13 06:06:33 crc kubenswrapper[4894]: I0613 06:06:33.428421 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-content/0.log" Jun 13 06:06:33 crc kubenswrapper[4894]: I0613 06:06:33.441827 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-content/0.log" Jun 13 06:06:33 crc kubenswrapper[4894]: I0613 06:06:33.604235 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-content/0.log" Jun 13 06:06:33 crc kubenswrapper[4894]: I0613 06:06:33.656626 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/extract-utilities/0.log" Jun 13 06:06:34 crc kubenswrapper[4894]: I0613 06:06:34.107346 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xltkn_28d29738-add5-4eae-882b-341e47914202/registry-server/0.log" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.337997 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-npbb2"] Jun 13 06:07:02 crc kubenswrapper[4894]: E0613 06:07:02.338769 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe13a8b-b79e-4787-9bd0-a887f1898204" containerName="container-00" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.338785 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe13a8b-b79e-4787-9bd0-a887f1898204" containerName="container-00" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.338967 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe13a8b-b79e-4787-9bd0-a887f1898204" containerName="container-00" Jun 13 
06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.339528 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.525601 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdj5k\" (UniqueName: \"kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.525642 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.627442 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdj5k\" (UniqueName: \"kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.627720 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.627810 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.654641 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdj5k\" (UniqueName: \"kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k\") pod \"crc-debug-npbb2\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.655110 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-npbb2" Jun 13 06:07:02 crc kubenswrapper[4894]: I0613 06:07:02.878990 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-npbb2" event={"ID":"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02","Type":"ContainerStarted","Data":"411d9bff49692df08ed4f7c6de8295849b674798bc236f1697a21dbe4f79b902"} Jun 13 06:07:03 crc kubenswrapper[4894]: I0613 06:07:03.887746 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-npbb2" event={"ID":"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02","Type":"ContainerStarted","Data":"3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e"} Jun 13 06:07:03 crc kubenswrapper[4894]: I0613 06:07:03.901316 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-npbb2" podStartSLOduration=1.9013002220000002 podStartE2EDuration="1.901300222s" podCreationTimestamp="2025-06-13 06:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:07:03.89767711 +0000 UTC m=+4582.343924573" watchObservedRunningTime="2025-06-13 06:07:03.901300222 +0000 UTC m=+4582.347547685" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.013302 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.015454 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.030234 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.128871 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9rkk\" (UniqueName: \"kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.129157 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.129195 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.231131 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9rkk\" (UniqueName: \"kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.231179 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.231214 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.231707 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.231745 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.617541 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9rkk\" (UniqueName: \"kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk\") pod \"certified-operators-mlw7l\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.626407 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.629709 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.630596 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.630941 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.744217 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.744317 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndk8q\" (UniqueName: \"kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.744346 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.846088 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.846438 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndk8q\" (UniqueName: \"kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.846464 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.846582 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.846955 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:08 crc kubenswrapper[4894]: I0613 06:07:08.866269 4894 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ndk8q\" (UniqueName: \"kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q\") pod \"community-operators-s2nwf\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.102639 4894 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.210340 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:09 crc kubenswrapper[4894]: W0613 06:07:09.228184 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc26f8c_7bea_4e6e_8183_e6203ee87d60.slice/crio-184baccc1d965576b2dab85af8fe2d33888599fc93267f0013e2ba60c17c68f4 WatchSource:0}: Error finding container 184baccc1d965576b2dab85af8fe2d33888599fc93267f0013e2ba60c17c68f4: Status 404 returned error can't find the container with id 184baccc1d965576b2dab85af8fe2d33888599fc93267f0013e2ba60c17c68f4 Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.595646 4894 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.944622 4894 generic.go:334] "Generic (PLEG): container finished" podID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerID="edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897" exitCode=0 Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.944857 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerDied","Data":"edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897"} Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.944882 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerStarted","Data":"9d076d0f345224a46866684e0dae253a1c83f1ce88ac8161d188e3555a7498a4"} Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.946577 4894 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.951060 4894 generic.go:334] "Generic (PLEG): container finished" podID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerID="d435769e3eb31c9f7cfd61022c508b654be86ab4c40b27133b07a47d4ab22028" exitCode=0 Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.951121 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerDied","Data":"d435769e3eb31c9f7cfd61022c508b654be86ab4c40b27133b07a47d4ab22028"} Jun 13 06:07:09 crc kubenswrapper[4894]: I0613 06:07:09.951161 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerStarted","Data":"184baccc1d965576b2dab85af8fe2d33888599fc93267f0013e2ba60c17c68f4"} Jun 13 06:07:10 crc kubenswrapper[4894]: I0613 06:07:10.979137 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" 
event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerStarted","Data":"6a854fb1f2d3b319710f2943a4eeece20eaebf83f4779e1cf344eed4a1fef4a9"} Jun 13 06:07:10 crc kubenswrapper[4894]: I0613 06:07:10.982623 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerStarted","Data":"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b"} Jun 13 06:07:12 crc kubenswrapper[4894]: I0613 06:07:12.004976 4894 generic.go:334] "Generic (PLEG): container finished" podID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerID="6a854fb1f2d3b319710f2943a4eeece20eaebf83f4779e1cf344eed4a1fef4a9" exitCode=0 Jun 13 06:07:12 crc kubenswrapper[4894]: I0613 06:07:12.006742 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerDied","Data":"6a854fb1f2d3b319710f2943a4eeece20eaebf83f4779e1cf344eed4a1fef4a9"} Jun 13 06:07:12 crc kubenswrapper[4894]: I0613 06:07:12.014975 4894 generic.go:334] "Generic (PLEG): container finished" podID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerID="85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b" exitCode=0 Jun 13 06:07:12 crc kubenswrapper[4894]: I0613 06:07:12.015058 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerDied","Data":"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b"} Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.024981 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerStarted","Data":"68dc582c08fd5bac5eebfb1ce322370b3afb751ac9cc293530f36d2e6e19694e"} Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.028221 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerStarted","Data":"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e"} Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.046265 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlw7l" podStartSLOduration=3.546923162 podStartE2EDuration="6.046249737s" podCreationTimestamp="2025-06-13 06:07:07 +0000 UTC" firstStartedPulling="2025-06-13 06:07:09.953076421 +0000 UTC m=+4588.399323884" lastFinishedPulling="2025-06-13 06:07:12.452402966 +0000 UTC m=+4590.898650459" observedRunningTime="2025-06-13 06:07:13.042798159 +0000 UTC m=+4591.489045622" watchObservedRunningTime="2025-06-13 06:07:13.046249737 +0000 UTC m=+4591.492497200" Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.242152 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2nwf" podStartSLOduration=2.602909194 podStartE2EDuration="5.242131872s" podCreationTimestamp="2025-06-13 06:07:08 +0000 UTC" firstStartedPulling="2025-06-13 06:07:09.946302 +0000 UTC m=+4588.392549463" lastFinishedPulling="2025-06-13 06:07:12.585524668 +0000 UTC m=+4591.031772141" observedRunningTime="2025-06-13 06:07:13.068960459 +0000 UTC m=+4591.515207952" watchObservedRunningTime="2025-06-13 06:07:13.242131872 +0000 UTC m=+4591.688379345" Jun 13 
06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.254781 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-npbb2"] Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.255012 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-npbb2" podUID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" containerName="container-00" containerID="cri-o://3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e" gracePeriod=2 Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.269582 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-npbb2"] Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.357800 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-npbb2" Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.456263 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host\") pod \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.456609 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host" (OuterVolumeSpecName: "host") pod "fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" (UID: "fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.458053 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdj5k\" (UniqueName: \"kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k\") pod \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\" (UID: \"fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02\") " Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.459304 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.471456 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k" (OuterVolumeSpecName: "kube-api-access-xdj5k") pod "fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" (UID: "fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02"). InnerVolumeSpecName "kube-api-access-xdj5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:07:13 crc kubenswrapper[4894]: I0613 06:07:13.561565 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdj5k\" (UniqueName: \"kubernetes.io/projected/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02-kube-api-access-xdj5k\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.037204 4894 generic.go:334] "Generic (PLEG): container finished" podID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" containerID="3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e" exitCode=0 Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.037271 4894 scope.go:117] "RemoveContainer" containerID="3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e" Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.037284 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-npbb2" Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.066442 4894 scope.go:117] "RemoveContainer" containerID="3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e" Jun 13 06:07:14 crc kubenswrapper[4894]: E0613 06:07:14.067154 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e\": container with ID starting with 3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e not found: ID does not exist" containerID="3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e" Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.067263 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e"} err="failed to get container status \"3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e\": rpc error: code = NotFound desc = could not find container \"3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e\": container with ID starting with 3e2f5ada6fd7da13e8540127e89aa23a27d40bada3f888f10001b18aef534d2e not found: ID does not exist" Jun 13 06:07:14 crc kubenswrapper[4894]: I0613 06:07:14.288722 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" path="/var/lib/kubelet/pods/fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02/volumes" Jun 13 06:07:18 crc kubenswrapper[4894]: I0613 06:07:18.631669 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:18 crc kubenswrapper[4894]: I0613 06:07:18.632417 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:18 crc kubenswrapper[4894]: I0613 06:07:18.715880 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:19 crc kubenswrapper[4894]: I0613 06:07:19.104882 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:19 crc kubenswrapper[4894]: I0613 06:07:19.104935 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:19 crc kubenswrapper[4894]: I0613 06:07:19.779758 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:19 crc kubenswrapper[4894]: I0613 06:07:19.856420 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:19 crc kubenswrapper[4894]: I0613 06:07:19.887265 4894 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:20 crc kubenswrapper[4894]: I0613 06:07:20.159001 4894 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:21 crc kubenswrapper[4894]: I0613 06:07:21.109448 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlw7l" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="registry-server" 
containerID="cri-o://68dc582c08fd5bac5eebfb1ce322370b3afb751ac9cc293530f36d2e6e19694e" gracePeriod=2 Jun 13 06:07:21 crc kubenswrapper[4894]: I0613 06:07:21.445242 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.122295 4894 generic.go:334] "Generic (PLEG): container finished" podID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerID="68dc582c08fd5bac5eebfb1ce322370b3afb751ac9cc293530f36d2e6e19694e" exitCode=0 Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.122489 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s2nwf" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="registry-server" containerID="cri-o://e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e" gracePeriod=2 Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.122808 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerDied","Data":"68dc582c08fd5bac5eebfb1ce322370b3afb751ac9cc293530f36d2e6e19694e"} Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.456342 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.593895 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities\") pod \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.594031 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9rkk\" (UniqueName: \"kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk\") pod \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.594110 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content\") pod \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\" (UID: \"7fc26f8c-7bea-4e6e-8183-e6203ee87d60\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.594711 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities" (OuterVolumeSpecName: "utilities") pod "7fc26f8c-7bea-4e6e-8183-e6203ee87d60" (UID: "7fc26f8c-7bea-4e6e-8183-e6203ee87d60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.602225 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk" (OuterVolumeSpecName: "kube-api-access-j9rkk") pod "7fc26f8c-7bea-4e6e-8183-e6203ee87d60" (UID: "7fc26f8c-7bea-4e6e-8183-e6203ee87d60"). InnerVolumeSpecName "kube-api-access-j9rkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.652804 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.665505 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fc26f8c-7bea-4e6e-8183-e6203ee87d60" (UID: "7fc26f8c-7bea-4e6e-8183-e6203ee87d60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.697610 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9rkk\" (UniqueName: \"kubernetes.io/projected/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-kube-api-access-j9rkk\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.697801 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.697848 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc26f8c-7bea-4e6e-8183-e6203ee87d60-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.799893 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content\") pod \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.800013 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities\") pod \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.800089 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndk8q\" (UniqueName: \"kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q\") pod \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\" (UID: \"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579\") " Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.801112 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities" (OuterVolumeSpecName: "utilities") pod "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" (UID: "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.815935 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q" (OuterVolumeSpecName: "kube-api-access-ndk8q") pod "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" (UID: "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579"). InnerVolumeSpecName "kube-api-access-ndk8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.870193 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" (UID: "ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.903716 4894 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-catalog-content\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.903764 4894 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-utilities\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:22 crc kubenswrapper[4894]: I0613 06:07:22.903787 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndk8q\" (UniqueName: \"kubernetes.io/projected/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579-kube-api-access-ndk8q\") on node \"crc\" DevicePath \"\"" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.139456 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlw7l" event={"ID":"7fc26f8c-7bea-4e6e-8183-e6203ee87d60","Type":"ContainerDied","Data":"184baccc1d965576b2dab85af8fe2d33888599fc93267f0013e2ba60c17c68f4"} Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.139505 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlw7l" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.139525 4894 scope.go:117] "RemoveContainer" containerID="68dc582c08fd5bac5eebfb1ce322370b3afb751ac9cc293530f36d2e6e19694e" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.146738 4894 generic.go:334] "Generic (PLEG): container finished" podID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerID="e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e" exitCode=0 Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.146782 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerDied","Data":"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e"} Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.146813 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2nwf" event={"ID":"ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579","Type":"ContainerDied","Data":"9d076d0f345224a46866684e0dae253a1c83f1ce88ac8161d188e3555a7498a4"} Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.146897 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2nwf" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.172037 4894 scope.go:117] "RemoveContainer" containerID="6a854fb1f2d3b319710f2943a4eeece20eaebf83f4779e1cf344eed4a1fef4a9" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.190346 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.200254 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlw7l"] Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.221934 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.225907 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2nwf"] Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.229798 4894 scope.go:117] "RemoveContainer" containerID="d435769e3eb31c9f7cfd61022c508b654be86ab4c40b27133b07a47d4ab22028" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.260416 4894 scope.go:117] "RemoveContainer" containerID="e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.296058 4894 scope.go:117] "RemoveContainer" containerID="85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.320797 4894 scope.go:117] "RemoveContainer" containerID="edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.345830 4894 scope.go:117] "RemoveContainer" containerID="e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e" Jun 13 06:07:23 crc kubenswrapper[4894]: E0613 06:07:23.346489 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e\": container with ID starting with e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e not found: ID does not exist" containerID="e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.346516 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e"} err="failed to get container status \"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e\": rpc error: code = NotFound desc = could not find container \"e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e\": container with ID starting with e37e875cad019bf0c6b111b3981d29583d602675c718beddcad152cac8b60b0e not found: ID does not exist" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.346535 4894 scope.go:117] "RemoveContainer" containerID="85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b" Jun 13 06:07:23 crc kubenswrapper[4894]: E0613 06:07:23.347159 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b\": container with ID starting with 85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b not found: ID does not exist" containerID="85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b" Jun 13 
06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.347178 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b"} err="failed to get container status \"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b\": rpc error: code = NotFound desc = could not find container \"85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b\": container with ID starting with 85fa86c3ef8ce9c71ff6adc22812ff3ad2864fbe9fc861de1944c73f52ee3b9b not found: ID does not exist" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.347191 4894 scope.go:117] "RemoveContainer" containerID="edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897" Jun 13 06:07:23 crc kubenswrapper[4894]: E0613 06:07:23.347495 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897\": container with ID starting with edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897 not found: ID does not exist" containerID="edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897" Jun 13 06:07:23 crc kubenswrapper[4894]: I0613 06:07:23.347515 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897"} err="failed to get container status \"edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897\": rpc error: code = NotFound desc = could not find container \"edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897\": container with ID starting with edb7b668bca7ec59015c30c375543ef1d97af178e3e4176b28828fd19a325897 not found: ID does not exist" Jun 13 06:07:24 crc kubenswrapper[4894]: I0613 06:07:24.291782 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" path="/var/lib/kubelet/pods/7fc26f8c-7bea-4e6e-8183-e6203ee87d60/volumes" Jun 13 06:07:24 crc kubenswrapper[4894]: I0613 06:07:24.292940 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" path="/var/lib/kubelet/pods/ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579/volumes" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.677842 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-kml56"] Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679470 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" containerName="container-00" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679505 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" containerName="container-00" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679587 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="extract-content" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679604 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="extract-content" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679628 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679641 4894 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679717 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="extract-utilities" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679731 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="extract-utilities" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679753 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="extract-content" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679766 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="extract-content" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679794 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679807 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: E0613 06:08:01.679829 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="extract-utilities" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.679841 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="extract-utilities" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.680205 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8b1afa-9f5a-4dd7-9216-5a87f2cc0579" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.680244 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8e8e88-f2c1-4d5f-9bc8-c4e94d8e4d02" containerName="container-00" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.680274 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc26f8c-7bea-4e6e-8183-e6203ee87d60" containerName="registry-server" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.681265 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.802150 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.802565 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drq46\" (UniqueName: \"kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.904088 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drq46\" (UniqueName: \"kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.904170 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.904314 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:01 crc kubenswrapper[4894]: I0613 06:08:01.926172 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drq46\" (UniqueName: \"kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46\") pod \"crc-debug-kml56\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " pod="openstack/crc-debug-kml56" Jun 13 06:08:02 crc kubenswrapper[4894]: I0613 06:08:02.012377 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kml56" Jun 13 06:08:02 crc kubenswrapper[4894]: W0613 06:08:02.054451 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1955414a_5a93_4595_8a14_31f8bc8f7df5.slice/crio-f67f4b283941df408bef5a515af8255c6d82d1f6f1a39fe1d795c369a12b1785 WatchSource:0}: Error finding container f67f4b283941df408bef5a515af8255c6d82d1f6f1a39fe1d795c369a12b1785: Status 404 returned error can't find the container with id f67f4b283941df408bef5a515af8255c6d82d1f6f1a39fe1d795c369a12b1785 Jun 13 06:08:02 crc kubenswrapper[4894]: I0613 06:08:02.625718 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kml56" event={"ID":"1955414a-5a93-4595-8a14-31f8bc8f7df5","Type":"ContainerStarted","Data":"f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f"} Jun 13 06:08:02 crc kubenswrapper[4894]: I0613 06:08:02.625793 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-kml56" event={"ID":"1955414a-5a93-4595-8a14-31f8bc8f7df5","Type":"ContainerStarted","Data":"f67f4b283941df408bef5a515af8255c6d82d1f6f1a39fe1d795c369a12b1785"} Jun 13 06:08:02 crc kubenswrapper[4894]: I0613 06:08:02.662775 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-kml56" podStartSLOduration=1.662747215 podStartE2EDuration="1.662747215s" podCreationTimestamp="2025-06-13 06:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:08:02.652790414 +0000 UTC m=+4641.099037917" watchObservedRunningTime="2025-06-13 06:08:02.662747215 +0000 UTC m=+4641.108994708" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.652663 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-kml56"] Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.653759 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-kml56" podUID="1955414a-5a93-4595-8a14-31f8bc8f7df5" containerName="container-00" containerID="cri-o://f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f" gracePeriod=2 Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.663443 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-kml56"] Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.754524 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-kml56" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.754635 4894 generic.go:334] "Generic (PLEG): container finished" podID="1955414a-5a93-4595-8a14-31f8bc8f7df5" containerID="f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f" exitCode=0 Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.754804 4894 scope.go:117] "RemoveContainer" containerID="f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.779213 4894 scope.go:117] "RemoveContainer" containerID="f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f" Jun 13 06:08:12 crc kubenswrapper[4894]: E0613 06:08:12.779868 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f\": container with ID starting with f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f not found: ID does not exist" containerID="f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.779910 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f"} err="failed to get container status \"f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f\": rpc error: code = NotFound desc = could not find container \"f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f\": container with ID starting with f09f26b02d94644d5b55e772eff71163171a872f93c6a309c5b871f2b187652f not found: ID does not exist" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.888042 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drq46\" (UniqueName: \"kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46\") pod \"1955414a-5a93-4595-8a14-31f8bc8f7df5\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.888117 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host\") pod \"1955414a-5a93-4595-8a14-31f8bc8f7df5\" (UID: \"1955414a-5a93-4595-8a14-31f8bc8f7df5\") " Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.888368 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host" (OuterVolumeSpecName: "host") pod "1955414a-5a93-4595-8a14-31f8bc8f7df5" (UID: "1955414a-5a93-4595-8a14-31f8bc8f7df5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.889148 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1955414a-5a93-4595-8a14-31f8bc8f7df5-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:08:12 crc kubenswrapper[4894]: I0613 06:08:12.893627 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46" (OuterVolumeSpecName: "kube-api-access-drq46") pod "1955414a-5a93-4595-8a14-31f8bc8f7df5" (UID: "1955414a-5a93-4595-8a14-31f8bc8f7df5"). InnerVolumeSpecName "kube-api-access-drq46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:08:13 crc kubenswrapper[4894]: I0613 06:08:13.004858 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drq46\" (UniqueName: \"kubernetes.io/projected/1955414a-5a93-4595-8a14-31f8bc8f7df5-kube-api-access-drq46\") on node \"crc\" DevicePath \"\"" Jun 13 06:08:13 crc kubenswrapper[4894]: I0613 06:08:13.767793 4894 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/crc-debug-kml56" Jun 13 06:08:14 crc kubenswrapper[4894]: I0613 06:08:14.294683 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1955414a-5a93-4595-8a14-31f8bc8f7df5" path="/var/lib/kubelet/pods/1955414a-5a93-4595-8a14-31f8bc8f7df5/volumes" Jun 13 06:08:54 crc kubenswrapper[4894]: I0613 06:08:54.227328 4894 generic.go:334] "Generic (PLEG): container finished" podID="4a5c4b72-b727-4db0-b145-e14f5f9f3087" containerID="f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b" exitCode=0 Jun 13 06:08:54 crc kubenswrapper[4894]: I0613 06:08:54.227980 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" event={"ID":"4a5c4b72-b727-4db0-b145-e14f5f9f3087","Type":"ContainerDied","Data":"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b"} Jun 13 06:08:54 crc kubenswrapper[4894]: I0613 06:08:54.228887 4894 scope.go:117] "RemoveContainer" containerID="f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b" Jun 13 06:08:54 crc kubenswrapper[4894]: I0613 06:08:54.855056 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vp6sz_must-gather-8x9kr_4a5c4b72-b727-4db0-b145-e14f5f9f3087/gather/0.log" Jun 13 06:08:56 crc kubenswrapper[4894]: I0613 06:08:56.236484 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 06:08:56 crc kubenswrapper[4894]: I0613 06:08:56.236547 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.077950 4894 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/crc-debug-86szg"] Jun 13 06:09:02 crc kubenswrapper[4894]: E0613 06:09:02.079180 4894 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1955414a-5a93-4595-8a14-31f8bc8f7df5" containerName="container-00" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.079207 4894 state_mem.go:107] "Deleted CPUSet assignment" podUID="1955414a-5a93-4595-8a14-31f8bc8f7df5" containerName="container-00" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.079745 4894 memory_manager.go:354] "RemoveStaleState removing state" podUID="1955414a-5a93-4595-8a14-31f8bc8f7df5" containerName="container-00" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.080692 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.229718 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.229786 4894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddf2k\" (UniqueName: \"kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.331870 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.331921 4894 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddf2k\" (UniqueName: \"kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.332538 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.356595 4894 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddf2k\" (UniqueName: \"kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k\") pod \"crc-debug-86szg\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: I0613 06:09:02.413579 4894 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-86szg" Jun 13 06:09:02 crc kubenswrapper[4894]: W0613 06:09:02.444292 4894 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf97f698_0300_4962_a764_ec2c4c8c5a3c.slice/crio-c73cc585f11ef5315a6e6a667071046b805a88cab4d86b39f4bcc6674a04dd8b WatchSource:0}: Error finding container c73cc585f11ef5315a6e6a667071046b805a88cab4d86b39f4bcc6674a04dd8b: Status 404 returned error can't find the container with id c73cc585f11ef5315a6e6a667071046b805a88cab4d86b39f4bcc6674a04dd8b Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.290530 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vp6sz/must-gather-8x9kr"] Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.296105 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" podUID="4a5c4b72-b727-4db0-b145-e14f5f9f3087" containerName="copy" containerID="cri-o://f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc" gracePeriod=2 Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.302424 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vp6sz/must-gather-8x9kr"] Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.321486 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-86szg" event={"ID":"df97f698-0300-4962-a764-ec2c4c8c5a3c","Type":"ContainerStarted","Data":"d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d"} Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.321533 4894 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/crc-debug-86szg" event={"ID":"df97f698-0300-4962-a764-ec2c4c8c5a3c","Type":"ContainerStarted","Data":"c73cc585f11ef5315a6e6a667071046b805a88cab4d86b39f4bcc6674a04dd8b"} Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.344507 4894 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/crc-debug-86szg" podStartSLOduration=1.3444837729999999 podStartE2EDuration="1.344483773s" podCreationTimestamp="2025-06-13 06:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-13 06:09:03.336678462 +0000 UTC m=+4701.782925925" watchObservedRunningTime="2025-06-13 06:09:03.344483773 +0000 UTC m=+4701.790731246" Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.720582 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vp6sz_must-gather-8x9kr_4a5c4b72-b727-4db0-b145-e14f5f9f3087/copy/0.log" Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.721280 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.858363 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output\") pod \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.858494 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mb4d\" (UniqueName: \"kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d\") pod \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\" (UID: \"4a5c4b72-b727-4db0-b145-e14f5f9f3087\") " Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.864670 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d" (OuterVolumeSpecName: "kube-api-access-6mb4d") pod "4a5c4b72-b727-4db0-b145-e14f5f9f3087" (UID: "4a5c4b72-b727-4db0-b145-e14f5f9f3087"). InnerVolumeSpecName "kube-api-access-6mb4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:09:03 crc kubenswrapper[4894]: I0613 06:09:03.960596 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mb4d\" (UniqueName: \"kubernetes.io/projected/4a5c4b72-b727-4db0-b145-e14f5f9f3087-kube-api-access-6mb4d\") on node \"crc\" DevicePath \"\"" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.058314 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4a5c4b72-b727-4db0-b145-e14f5f9f3087" (UID: "4a5c4b72-b727-4db0-b145-e14f5f9f3087"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.063086 4894 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a5c4b72-b727-4db0-b145-e14f5f9f3087-must-gather-output\") on node \"crc\" DevicePath \"\"" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.286120 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5c4b72-b727-4db0-b145-e14f5f9f3087" path="/var/lib/kubelet/pods/4a5c4b72-b727-4db0-b145-e14f5f9f3087/volumes" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.329481 4894 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vp6sz_must-gather-8x9kr_4a5c4b72-b727-4db0-b145-e14f5f9f3087/copy/0.log" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.329839 4894 generic.go:334] "Generic (PLEG): container finished" podID="4a5c4b72-b727-4db0-b145-e14f5f9f3087" containerID="f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc" exitCode=143 Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.329878 4894 scope.go:117] "RemoveContainer" containerID="f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.329974 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6sz/must-gather-8x9kr" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.355847 4894 scope.go:117] "RemoveContainer" containerID="f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.439938 4894 scope.go:117] "RemoveContainer" containerID="f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc" Jun 13 06:09:04 crc kubenswrapper[4894]: E0613 06:09:04.444438 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc\": container with ID starting with f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc not found: ID does not exist" containerID="f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.444490 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc"} err="failed to get container status \"f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc\": rpc error: code = NotFound desc = could not find container \"f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc\": container with ID starting with f95efb16a5d2f4b53bdf204ea0cf3420ef476aebaf288f0a0e4dd1f6d15d58dc not found: ID does not exist" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.444524 4894 scope.go:117] "RemoveContainer" containerID="f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b" Jun 13 06:09:04 crc kubenswrapper[4894]: E0613 06:09:04.444868 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b\": container with ID starting with f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b not found: ID does not exist" containerID="f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b" Jun 13 06:09:04 crc kubenswrapper[4894]: I0613 06:09:04.444889 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b"} err="failed to get container status \"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b\": rpc error: code = NotFound desc = could not find container \"f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b\": container with ID starting with f91d6e7d4145696e6c1df8a22461bdf65208bc95659f736add746416a6c8d06b not found: ID does not exist" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.021372 4894 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/crc-debug-86szg"] Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.022153 4894 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/crc-debug-86szg" podUID="df97f698-0300-4962-a764-ec2c4c8c5a3c" containerName="container-00" containerID="cri-o://d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d" gracePeriod=2 Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.032780 4894 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/crc-debug-86szg"] Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.111553 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-86szg" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.263224 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host\") pod \"df97f698-0300-4962-a764-ec2c4c8c5a3c\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.263282 4894 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddf2k\" (UniqueName: \"kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k\") pod \"df97f698-0300-4962-a764-ec2c4c8c5a3c\" (UID: \"df97f698-0300-4962-a764-ec2c4c8c5a3c\") " Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.263359 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host" (OuterVolumeSpecName: "host") pod "df97f698-0300-4962-a764-ec2c4c8c5a3c" (UID: "df97f698-0300-4962-a764-ec2c4c8c5a3c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.263815 4894 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97f698-0300-4962-a764-ec2c4c8c5a3c-host\") on node \"crc\" DevicePath \"\"" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.270088 4894 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k" (OuterVolumeSpecName: "kube-api-access-ddf2k") pod "df97f698-0300-4962-a764-ec2c4c8c5a3c" (UID: "df97f698-0300-4962-a764-ec2c4c8c5a3c"). InnerVolumeSpecName "kube-api-access-ddf2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.365716 4894 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddf2k\" (UniqueName: \"kubernetes.io/projected/df97f698-0300-4962-a764-ec2c4c8c5a3c-kube-api-access-ddf2k\") on node \"crc\" DevicePath \"\"" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.435534 4894 generic.go:334] "Generic (PLEG): container finished" podID="df97f698-0300-4962-a764-ec2c4c8c5a3c" containerID="d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d" exitCode=0 Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.435596 4894 scope.go:117] "RemoveContainer" containerID="d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.436141 4894 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/crc-debug-86szg" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.465590 4894 scope.go:117] "RemoveContainer" containerID="d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d" Jun 13 06:09:13 crc kubenswrapper[4894]: E0613 06:09:13.466015 4894 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d\": container with ID starting with d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d not found: ID does not exist" containerID="d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d" Jun 13 06:09:13 crc kubenswrapper[4894]: I0613 06:09:13.466053 4894 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d"} err="failed to get container status \"d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d\": rpc error: code = NotFound desc = could not find container \"d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d\": container with ID starting with d8f5f300a698f03e760506c381b2efdfaf74bf99df8bc6d8bdfa2860ed5dbb8d not found: ID does not exist" Jun 13 06:09:14 crc kubenswrapper[4894]: I0613 06:09:14.291574 4894 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df97f698-0300-4962-a764-ec2c4c8c5a3c" path="/var/lib/kubelet/pods/df97f698-0300-4962-a764-ec2c4c8c5a3c/volumes" Jun 13 06:09:26 crc kubenswrapper[4894]: I0613 06:09:26.236707 4894 patch_prober.go:28] interesting pod/machine-config-daemon-t6vz8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jun 13 06:09:26 crc kubenswrapper[4894]: I0613 06:09:26.237233 4894 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t6vz8" podUID="192fcf92-25d2-4664-bb9d-8857929dd084" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"